Mirror of https://github.com/grafana/grafana.git (synced 2025-02-25 18:55:37 -06:00)

Commit f045fbf267: Merge branch 'master' of github.com:grafana/grafana

CHANGELOG.md (16 lines changed)
@@ -2,14 +2,23 @@

 ## Enhancements

 * **Alerting**: Added Telegram alert notifier [#7098](https://github.com/grafana/grafana/pull/7098), thx [@leonoff](https://github.com/leonoff)
+* **Templating**: Make $__interval and $__interval_ms global built-in variables that can be used by any datasource (in panel queries), closes [#7190](https://github.com/grafana/grafana/issues/7190), closes [#6582](https://github.com/grafana/grafana/issues/6582)
+* **S3 Image Store**: The external S3 image store (used in alert notifications) now supports AWS IAM roles, closes [#6985](https://github.com/grafana/grafana/issues/6985), [#7058](https://github.com/grafana/grafana/issues/7058), thx [@mtanda](https://github.com/mtanda)

-# 4.0.0 (unreleased)
+# 4.1.1 (2017-01-11)
+
+### Bugfixes
+
+* **Graph Panel**: Fixed issue with legend height in table mode [#7221](https://github.com/grafana/grafana/issues/7221)
+
+# 4.1.0 (2017-01-11)

 ### Bugfixes

 * **Server side PNG rendering**: Fixed issue with y-axis label rotation in phantomjs-rendered images [#6924](https://github.com/grafana/grafana/issues/6924)
 * **Graph**: Fixed centering of y-axis label [#7099](https://github.com/grafana/grafana/issues/7099)
 * **Graph**: Fixed graph legend table mode and always-visible scrollbar [#6828](https://github.com/grafana/grafana/issues/6828)
 * **Templating**: Fixed template variable value groups/tags feature [#6752](https://github.com/grafana/grafana/issues/6752)
+* **Webhook**: Fixed webhook username mismatch [#7195](https://github.com/grafana/grafana/pull/7195), thx [@theisenmark](https://github.com/theisenmark)
+* **Influxdb**: Handles time(auto) the same way as time($interval) [#6997](https://github.com/grafana/grafana/issues/6997)

 ## Enhancements

 * **Elasticsearch**: Added support for all moving average options [#7154](https://github.com/grafana/grafana/pull/7154), thx [@vaibhavinbayarea](https://github.com/vaibhavinbayarea)
@@ -42,11 +51,6 @@

 * **Notifications**: Remove html escaping the email subject. [#6905](https://github.com/grafana/grafana/issues/6905)
 * **Influxdb**: Fixes broken field dropdown when using template vars as measurement. [#6473](https://github.com/grafana/grafana/issues/6473)

-# 4.0.3 (unreleased)
-
-### Bugfixes
-
-* **Influxdb**: Handles time(auto) the same way as time($interval) [#6997](https://github.com/grafana/grafana/issues/6997)

 # 4.0.2 (2016-12-08)

 ### Enhancements
@@ -18,7 +18,7 @@ Graphite, Elasticsearch, OpenTSDB, Prometheus and InfluxDB.

 - [What's New in Grafana 2.5](http://docs.grafana.org/guides/whats-new-in-v2-5/)
 - [What's New in Grafana 3.0](http://docs.grafana.org/guides/whats-new-in-v3/)
 - [What's New in Grafana 4.0](http://docs.grafana.org/guides/whats-new-in-v4/)
-- [What's New in Grafana 4.1 beta](http://docs.grafana.org/guides/whats-new-in-v4-1/)
+- [What's New in Grafana 4.1](http://docs.grafana.org/guides/whats-new-in-v4-1/)

 ## Features

 ### Graphite Target Editor
@@ -5,7 +5,7 @@ os: Windows Server 2012 R2

 clone_folder: c:\gopath\src\github.com\grafana\grafana

 environment:
-  nodejs_version: "5"
+  nodejs_version: "6"
   GOPATH: c:\gopath

 install:
docs/sources/administration/cli.md (new file, 30 lines)

+++
title = "Grafana CLI"
description = "Guide to using grafana-cli"
keywords = ["grafana", "cli", "grafana-cli", "command line interface"]
type = "docs"
[menu.docs]
parent = "admin"
weight = 8
+++

# Grafana CLI

Grafana CLI is a small executable that is bundled with the Grafana server and is meant to be run on the same machine the server runs on.

## Plugins

The CLI helps you install, upgrade, and manage your plugins on the machine it is running on. You can find more information about how to install and manage your plugins at the [plugin page]({{< relref "/installation.md" >}})

## Admin

> This feature is only available in Grafana 4.1 and above.

To show all admin commands:

`grafana-cli admin`

### Reset admin password

You can reset the password for the admin user using the CLI.

`grafana-cli admin reset-admin-password ...`
@@ -1,17 +1,17 @@

 +++
-title = "What's New in Grafana v4.1 beta"
+title = "What's New in Grafana v4.1"
-description = "Feature & improvement highlights for Grafana v4.1 beta"
+description = "Feature & improvement highlights for Grafana v4.1"
-keywords = ["grafana", "new", "documentation", "4.1.0-beta1"]
+keywords = ["grafana", "new", "documentation", "4.1.0"]
 type = "docs"
 [menu.docs]
-name = "Version 4.1 beta"
+name = "Version 4.1"
 identifier = "v4.1"
 parent = "whatsnew"
 weight = -1
 +++

-## Whats new in Grafana v4.1 beta
+## What's new in Grafana v4.1

 - **Graph**: Support for shared tooltip on all graphs as you hover over one graph. [#1578](https://github.com/grafana/grafana/pull/1578), [#6274](https://github.com/grafana/grafana/pull/6274)
 - **Victorops**: Add VictorOps notification integration [#6411](https://github.com/grafana/grafana/issues/6411), thx [@ichekrygin](https://github.com/ichekrygin)
 - **Opsgenie**: Add OpsGenie notification integration [#6687](https://github.com/grafana/grafana/issues/6687), thx [@kylemcc](https://github.com/kylemcc)
@@ -55,7 +55,7 @@ Once the `access key` and `secret key` have been saved the user will no longer b

 ## Upgrade & Breaking changes

-Elasticsearch 1.x is no longer supported. Please upgrade to Elasticsearch 2.x or 5.x. Otherwise Grafana 4.1.0-beta1 contains no breaking changes.
+Elasticsearch 1.x is no longer supported. Please upgrade to Elasticsearch 2.x or 5.x. Otherwise Grafana 4.1.0 contains no breaking changes.

 ## Changelog
@@ -275,3 +275,20 @@ Change password for specific user

     Content-Type: application/json

     {message: "User deleted"}

+## Pause all alerts
+
+`DELETE /api/admin/pause-all-alerts`
+
+**Example Request**:
+
+    DELETE /api/admin/pause-all-alerts HTTP/1.1
+    Accept: application/json
+    Content-Type: application/json
+
+**Example Response**:
+
+    HTTP/1.1 200
+    Content-Type: application/json
+
+    {state: "new state", message: "alerts pause/un paused", "alertsAffected": 100}
@@ -23,7 +23,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    GET /api/alerts HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -52,7 +52,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    GET /api/alerts/1 HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -80,7 +80,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    POST /api/alerts/1/pause HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -106,7 +106,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    GET /api/alert-notifications HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -131,7 +131,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    POST /api/alerts-notifications HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -140,8 +140,8 @@ This API can also be used to create, update and delete alert notifications.

       "name": "new alert notification", //Required
       "type": "email", //Required
       "isDefault": false,
       "settings": {
-        "addresses: "carl@grafana.com;dev@grafana.com"
+        "addresses": "carl@grafana.com;dev@grafana.com"
       }
     }

@@ -166,7 +166,7 @@ This API can also be used to create, update and delete alert notifications.

 **Example Request**:

-    GET /api/org HTTP/1.1
+    PUT /api/alerts-notifications/1 HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk

@@ -198,11 +198,11 @@ This API can also be used to create, update and delete alert notifications.

 ## Delete alert notification

-`DELETE /api/alerts-notifications/1`
+`DELETE /api/alerts-notifications/:notificationId`

 **Example Request**:

-    GET /api/org HTTP/1.1
+    DELETE /api/alerts-notifications/1 HTTP/1.1
     Accept: application/json
     Content-Type: application/json
     Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
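The corrected create endpoint above can be exercised from Go. This is a minimal sketch using only the standard library; the base URL and API key are placeholders, `createNotification` is an illustrative helper name, and the payload mirrors the documented example:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// createNotification builds (but does not send) the POST request for the
// documented /api/alerts-notifications endpoint.
func createNotification(baseURL, apiKey string) (*http.Request, error) {
	payload := map[string]interface{}{
		"name":      "new alert notification", // Required
		"type":      "email",                  // Required
		"isDefault": false,
		"settings": map[string]string{
			"addresses": "carl@grafana.com;dev@grafana.com",
		},
	}
	body, err := json.Marshal(payload)
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST", baseURL+"/api/alerts-notifications", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Accept", "application/json")
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	// The API key here is a placeholder, not a real token.
	req, err := createNotification("http://localhost:3000", "eyJr...")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.Path) // POST /api/alerts-notifications
}
```

Sending it is then a single `http.DefaultClient.Do(req)` call against a running Grafana instance.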
@@ -15,23 +15,14 @@ weight = 1

 Description | Download
 ------------ | -------------
-Stable for Debian-based Linux | [4.0.2 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.2-1481203731_amd64.deb)
+Stable for Debian-based Linux | [4.1.0 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_4.1.0-1484127817_amd64.deb)
-Latest beta for Debian-based Linux | [4.1.0-beta1 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_4.1.0-1482230757beta1_amd64.deb)

 ## Install Stable

 ```
-$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.2-1481203731_amd64.deb
+$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_4.1.0-1484127817_amd64.deb
 $ sudo apt-get install -y adduser libfontconfig
-$ sudo dpkg -i grafana_4.0.2-1481203731_amd64.deb
+$ sudo dpkg -i grafana_4.1.0-1484127817_amd64.deb
 ```
-
-## Install Latest Beta
-
-```
-$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_4.1.0-1482230757beta1_amd64.deb
-$ sudo apt-get install -y adduser libfontconfig
-$ sudo dpkg -i grafana_4.1.0-1482230757beta1_amd64.deb
-```

 ## APT Repository
@@ -15,40 +15,24 @@ weight = 2

 Description | Download
 ------------ | -------------
-Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [4.0.2 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.2-1481203731.x86_64.rpm)
+Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [4.1.0 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0-1484127817.x86_64.rpm)
-Latest beta for CentOS / Fedora / OpenSuse / Redhat Linux | [4.1.0-beta1 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0-1482230757beta1.x86_64.rpm)

 ## Install Stable

 You can install Grafana using Yum directly.

-    $ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.2-1481203731.x86_64.rpm
+    $ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0-1484127817.x86_64.rpm

 Or install manually using `rpm`.

 #### On CentOS / Fedora / Redhat:

     $ sudo yum install initscripts fontconfig
-    $ sudo rpm -Uvh grafana-4.0.2-1481203731.x86_64.rpm
+    $ sudo rpm -Uvh grafana-4.1.0-1484127817.x86_64.rpm

 #### On OpenSuse:

-    $ sudo rpm -i --nodeps grafana-4.0.2-1481203731.x86_64.rpm
+    $ sudo rpm -i --nodeps grafana-4.1.0-1484127817.x86_64.rpm

-## Or Install Latest Beta
-
-    $ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0-1482230757beta1.x86_64.rpm
-
-Or install manually using `rpm`.
-
-#### On CentOS / Fedora / Redhat:
-
-    $ sudo yum install initscripts fontconfig
-    $ sudo rpm -Uvh grafana-4.1.0-1482230757beta1.x86_64.rpm
-
-#### On OpenSuse:
-
-    $ sudo rpm -i --nodeps grafana-4.1.0-1482230757beta1.x86_64.rpm

 ## Install via YUM Repository
@@ -13,8 +13,7 @@ weight = 3

 Description | Download
 ------------ | -------------
-Latest stable package for Windows | [grafana.4.0.2.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.2.windows-x64.zip)
+Latest stable package for Windows | [grafana.4.1.0.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0.windows-x64.zip)
-Latest beta package for Windows | [grafana.4.1.0-beta1.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.0-beta1.windows-x64.zip)

 ## Configure
@@ -78,7 +78,7 @@

     "sinon": "1.17.6",
     "systemjs-builder": "^0.15.34",
     "tether": "^1.4.0",
-    "tether-drop": "git://github.com/torkelo/drop",
+    "tether-drop": "https://github.com/torkelo/drop",
    "tslint": "^4.0.2",
     "typescript": "^2.1.4",
     "virtual-scroll": "^1.1.1"
@@ -1,6 +1,6 @@

 #! /usr/bin/env bash
-deb_ver=4.0.2-1481203731
+deb_ver=4.1.0-1484127817
-rpm_ver=4.0.2-1481203731
+rpm_ver=4.1.0-1484127817

 wget https://grafanarel.s3.amazonaws.com/builds/grafana_${deb_ver}_amd64.deb
@@ -2,6 +2,7 @@ package imguploader

 import (
 	"fmt"
+	"regexp"

 	"github.com/grafana/grafana/pkg/setting"
 )

@@ -30,19 +31,21 @@ func NewImageUploader() (ImageUploader, error) {

 		accessKey := s3sec.Key("access_key").MustString("")
 		secretKey := s3sec.Key("secret_key").MustString("")

-		if bucket == "" {
+		region := ""
+		rBucket := regexp.MustCompile(`https?:\/\/(.*)\.s3(-([^.]+))?\.amazonaws\.com\/?`)
+		matches := rBucket.FindStringSubmatch(bucket)
+		if len(matches) == 0 {
 			return nil, fmt.Errorf("Could not find bucket setting for image.uploader.s3")
+		} else {
+			bucket = matches[1]
+			if matches[3] != "" {
+				region = matches[3]
+			} else {
+				region = "us-east-1"
+			}
 		}

-		if accessKey == "" {
-			return nil, fmt.Errorf("Could not find accessKey setting for image.uploader.s3")
-		}
-
-		if secretKey == "" {
-			return nil, fmt.Errorf("Could not find secretKey setting for image.uploader.s3")
-		}
-
-		return NewS3Uploader(bucket, accessKey, secretKey), nil
+		return NewS3Uploader(region, bucket, "public-read", accessKey, secretKey), nil

 	case "webdav":
 		webdavSec, err := setting.Cfg.GetSection("external_image_storage.webdav")
 		if err != nil {
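The bucket-URL parsing introduced above can be tried in isolation. This standalone sketch reuses the same regular expression and region-defaulting logic; the function name is illustrative:

```go
package main

import (
	"fmt"
	"regexp"
)

// parseBucketURL extracts the bucket name and optional region from a
// virtual-hosted-style S3 URL, defaulting to us-east-1 when no region
// suffix is present (mirroring the logic in NewImageUploader above).
func parseBucketURL(raw string) (bucket, region string, ok bool) {
	rBucket := regexp.MustCompile(`https?:\/\/(.*)\.s3(-([^.]+))?\.amazonaws\.com\/?`)
	matches := rBucket.FindStringSubmatch(raw)
	if len(matches) == 0 {
		return "", "", false
	}
	bucket = matches[1]
	region = "us-east-1"
	if matches[3] != "" {
		region = matches[3]
	}
	return bucket, region, true
}

func main() {
	b, r, _ := parseBucketURL("https://foo.bar.baz.s3-us-east-2.amazonaws.com")
	fmt.Println(b, r) // foo.bar.baz us-east-2
	b, r, _ = parseBucketURL("https://mybucket.s3.amazonaws.com/")
	fmt.Println(b, r) // mybucket us-east-1
}
```

The second capture group is the whole `-region` suffix; only the inner group (`matches[3]`) carries the region itself, which is why the code checks index 3.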
@@ -19,7 +19,7 @@ func TestImageUploaderFactory(t *testing.T) {

 		setting.ImageUploadProvider = "s3"

 		s3sec, err := setting.Cfg.GetSection("external_image_storage.s3")
-		s3sec.NewKey("bucket_url", "bucket_url")
+		s3sec.NewKey("bucket_url", "https://foo.bar.baz.s3-us-east-2.amazonaws.com")
 		s3sec.NewKey("access_key", "access_key")
 		s3sec.NewKey("secret_key", "secret_key")

@@ -29,9 +29,10 @@ func TestImageUploaderFactory(t *testing.T) {

 		original, ok := uploader.(*S3Uploader)

 		So(ok, ShouldBeTrue)
+		So(original.region, ShouldEqual, "us-east-2")
+		So(original.bucket, ShouldEqual, "foo.bar.baz")
 		So(original.accessKey, ShouldEqual, "access_key")
 		So(original.secretKey, ShouldEqual, "secret_key")
-		So(original.bucket, ShouldEqual, "bucket_url")
 	})

 	Convey("Webdav uploader", func() {
@@ -1,26 +1,33 @@

 package imguploader

 import (
-	"io/ioutil"
-	"net/http"
-	"net/url"
-	"path"
+	"os"
+	"time"

+	"github.com/aws/aws-sdk-go/aws"
+	"github.com/aws/aws-sdk-go/aws/credentials"
+	"github.com/aws/aws-sdk-go/aws/credentials/ec2rolecreds"
+	"github.com/aws/aws-sdk-go/aws/ec2metadata"
+	"github.com/aws/aws-sdk-go/aws/session"
+	"github.com/aws/aws-sdk-go/service/s3"
 	"github.com/grafana/grafana/pkg/log"
 	"github.com/grafana/grafana/pkg/util"
-	"github.com/kr/s3/s3util"
 )

 type S3Uploader struct {
+	region    string
 	bucket    string
+	acl       string
 	secretKey string
 	accessKey string
 	log       log.Logger
 }

-func NewS3Uploader(bucket, accessKey, secretKey string) *S3Uploader {
+func NewS3Uploader(region, bucket, acl, accessKey, secretKey string) *S3Uploader {
 	return &S3Uploader{
+		region:    region,
 		bucket:    bucket,
+		acl:       acl,
 		accessKey: accessKey,
 		secretKey: secretKey,
 		log:       log.New("s3uploader"),

@@ -28,42 +35,41 @@ func NewS3Uploader(bucket, accessKey, secretKey string) *S3Uploader {

 }

 func (u *S3Uploader) Upload(imageDiskPath string) (string, error) {
-	s3util.DefaultConfig.AccessKey = u.accessKey
-	s3util.DefaultConfig.SecretKey = u.secretKey
-
-	header := make(http.Header)
-	header.Add("x-amz-acl", "public-read")
-	header.Add("Content-Type", "image/png")
-
-	var imageUrl *url.URL
-	var err error
-
-	if imageUrl, err = url.Parse(u.bucket); err != nil {
-		return "", err
-	}
-
-	// add image to url
-	imageUrl.Path = path.Join(imageUrl.Path, util.GetRandomString(20)+".png")
-	imageUrlString := imageUrl.String()
-	log.Debug("Uploading image to s3", "url", imageUrlString)
-
-	writer, err := s3util.Create(imageUrlString, header, nil)
-	if err != nil {
-		return "", err
-	}
-
-	defer writer.Close()
-
-	imgData, err := ioutil.ReadFile(imageDiskPath)
-	if err != nil {
-		return "", err
-	}
-
-	_, err = writer.Write(imgData)
-	if err != nil {
-		return "", err
-	}
-
-	return imageUrlString, nil
+	sess := session.New()
+	creds := credentials.NewChainCredentials(
+		[]credentials.Provider{
+			&credentials.StaticProvider{Value: credentials.Value{
+				AccessKeyID:     u.accessKey,
+				SecretAccessKey: u.secretKey,
+			}},
+			&credentials.EnvProvider{},
+			&ec2rolecreds.EC2RoleProvider{Client: ec2metadata.New(sess), ExpiryWindow: 5 * time.Minute},
+		})
+	cfg := &aws.Config{
+		Region:      aws.String(u.region),
+		Credentials: creds,
+	}
+
+	key := util.GetRandomString(20) + ".png"
+	log.Debug("Uploading image to s3", "bucket = ", u.bucket, ", key = ", key)
+
+	file, err := os.Open(imageDiskPath)
+	if err != nil {
+		return "", err
+	}
+
+	svc := s3.New(session.New(cfg), cfg)
+	params := &s3.PutObjectInput{
+		Bucket:      aws.String(u.bucket),
+		Key:         aws.String(key),
+		ACL:         aws.String(u.acl),
+		Body:        file,
+		ContentType: aws.String("image/png"),
+	}
+	_, err = svc.PutObject(params)
+	if err != nil {
+		return "", err
+	}
+
+	return "https://" + u.bucket + ".s3.amazonaws.com/" + key, nil
 }
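The rewritten uploader resolves credentials through a provider chain: static keys first, then environment variables, then EC2 instance-role metadata, which is what enables the IAM-role support in this commit. Stripped of the AWS SDK, the pattern can be sketched like this (all type and function names here are illustrative, not the SDK's):

```go
package main

import (
	"errors"
	"fmt"
)

// Value holds a resolved credential pair.
type Value struct {
	AccessKeyID     string
	SecretAccessKey string
}

// Provider is anything that can attempt to produce credentials.
type Provider interface {
	Retrieve() (Value, error)
}

// StaticProvider returns fixed keys, but only if both are non-empty,
// so an unconfigured static provider falls through to the next one.
type StaticProvider struct{ Value Value }

func (p StaticProvider) Retrieve() (Value, error) {
	if p.Value.AccessKeyID == "" || p.Value.SecretAccessKey == "" {
		return Value{}, errors.New("static credentials are empty")
	}
	return p.Value, nil
}

// Chain tries each provider in order and returns the first success.
func Chain(providers []Provider) (Value, error) {
	for _, p := range providers {
		if v, err := p.Retrieve(); err == nil {
			return v, nil
		}
	}
	return Value{}, errors.New("no valid credential providers")
}

func main() {
	v, err := Chain([]Provider{
		StaticProvider{}, // empty: skipped, like unset config keys
		StaticProvider{Value: Value{AccessKeyID: "AKIAEXAMPLE", SecretAccessKey: "secret"}},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(v.AccessKeyID) // AKIAEXAMPLE
}
```

In the real code the last link is the EC2 role provider, so a Grafana instance running on EC2 with an attached IAM role needs no keys configured at all.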
@@ -48,6 +48,7 @@ var (

 	M_Alerting_Notification_Sent_Victorops Counter
 	M_Alerting_Notification_Sent_OpsGenie  Counter
 	M_Alerting_Notification_Sent_Telegram  Counter
+	M_Alerting_Notification_Sent_Sensu     Counter
 	M_Aws_CloudWatch_GetMetricStatistics   Counter
 	M_Aws_CloudWatch_ListMetrics           Counter

@@ -116,6 +117,7 @@ func initMetricVars(settings *MetricSettings) {

 	M_Alerting_Notification_Sent_Victorops = RegCounter("alerting.notifications_sent", "type", "victorops")
 	M_Alerting_Notification_Sent_OpsGenie = RegCounter("alerting.notifications_sent", "type", "opsgenie")
 	M_Alerting_Notification_Sent_Telegram = RegCounter("alerting.notifications_sent", "type", "telegram")
+	M_Alerting_Notification_Sent_Sensu = RegCounter("alerting.notifications_sent", "type", "sensu")

 	M_Aws_CloudWatch_GetMetricStatistics = RegCounter("aws.cloudwatch.get_metric_statistics")
 	M_Aws_CloudWatch_ListMetrics = RegCounter("aws.cloudwatch.list_metrics")
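The `RegCounter` calls above register one counter per notifier type under the same metric name, distinguished by a tag. A minimal sketch of that pattern with assumed semantics (the real registry lives in `pkg/metrics` and is more involved):

```go
package main

import "fmt"

// Counter is a trivially incrementable metric.
type Counter struct{ n int64 }

func (c *Counter) Inc(v int64) { c.n += v }

// registry keys counters by metric name plus tag key/value pairs,
// so each notifier type gets its own series.
var registry = map[string]*Counter{}

// RegCounter registers and returns a counter for the given name and tags.
func RegCounter(name string, tags ...string) *Counter {
	key := name
	for _, t := range tags {
		key += "." + t
	}
	c := &Counter{}
	registry[key] = c
	return c
}

func main() {
	sensu := RegCounter("alerting.notifications_sent", "type", "sensu")
	sensu.Inc(1)
	fmt.Println(registry["alerting.notifications_sent.type.sensu"].n) // 1
}
```

The point of the pattern is that the new Sensu notifier only has to call `Inc(1)` on its pre-registered counter; registration happens once at startup in `initMetricVars`.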
pkg/services/alerting/notifiers/sensu.go (new file, 115 lines)

package notifiers

import (
	"github.com/grafana/grafana/pkg/bus"
	"github.com/grafana/grafana/pkg/components/simplejson"
	"github.com/grafana/grafana/pkg/log"
	"github.com/grafana/grafana/pkg/metrics"
	m "github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/services/alerting"
	"strconv"
	"strings"
)

func init() {
	alerting.RegisterNotifier(&alerting.NotifierPlugin{
		Type:        "sensu",
		Name:        "Sensu",
		Description: "Sends HTTP POST request to a Sensu API",
		Factory:     NewSensuNotifier,
		OptionsTemplate: `
      <h3 class="page-heading">Sensu settings</h3>
      <div class="gf-form">
        <span class="gf-form-label width-10">Url</span>
        <input type="text" required class="gf-form-input max-width-26" ng-model="ctrl.model.settings.url" placeholder="http://sensu-api.local:4567/results"></input>
      </div>
      <div class="gf-form">
        <span class="gf-form-label width-10">Username</span>
        <input type="text" class="gf-form-input max-width-14" ng-model="ctrl.model.settings.username"></input>
      </div>
      <div class="gf-form">
        <span class="gf-form-label width-10">Password</span>
        <input type="text" class="gf-form-input max-width-14" ng-model="ctrl.model.settings.password"></input>
      </div>
    `,
	})
}

func NewSensuNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
	url := model.Settings.Get("url").MustString()
	if url == "" {
		return nil, alerting.ValidationError{Reason: "Could not find url property in settings"}
	}

	return &SensuNotifier{
		NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
		Url:          url,
		User:         model.Settings.Get("username").MustString(),
		Password:     model.Settings.Get("password").MustString(),
		log:          log.New("alerting.notifier.sensu"),
	}, nil
}

type SensuNotifier struct {
	NotifierBase
	Url      string
	User     string
	Password string
	log      log.Logger
}

func (this *SensuNotifier) Notify(evalContext *alerting.EvalContext) error {
	this.log.Info("Sending sensu result")
	metrics.M_Alerting_Notification_Sent_Sensu.Inc(1)

	bodyJSON := simplejson.New()
	bodyJSON.Set("ruleId", evalContext.Rule.Id)
	// Sensu alerts cannot have spaces in them
	bodyJSON.Set("name", strings.Replace(evalContext.Rule.Name, " ", "_", -1))
	// Sensu alerts require a command
	// We set it to the grafana ruleID
	bodyJSON.Set("source", "grafana_rule_"+strconv.FormatInt(evalContext.Rule.Id, 10))
	// Finally, sensu expects an output
	// We set it to a default output
	bodyJSON.Set("output", "Grafana Metric Condition Met")
	bodyJSON.Set("evalMatches", evalContext.EvalMatches)

	if evalContext.Rule.State == "alerting" {
		bodyJSON.Set("status", 2)
	} else if evalContext.Rule.State == "no_data" {
		bodyJSON.Set("status", 1)
	} else {
		bodyJSON.Set("status", 0)
	}

	ruleUrl, err := evalContext.GetRuleUrl()
	if err == nil {
		bodyJSON.Set("ruleUrl", ruleUrl)
	}

	if evalContext.ImagePublicUrl != "" {
		bodyJSON.Set("imageUrl", evalContext.ImagePublicUrl)
	}

	if evalContext.Rule.Message != "" {
		bodyJSON.Set("message", evalContext.Rule.Message)
	}

	body, _ := bodyJSON.MarshalJSON()

	cmd := &m.SendWebhookSync{
		Url:        this.Url,
		User:       this.User,
		Password:   this.Password,
		Body:       string(body),
		HttpMethod: "POST",
	}

	if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
		this.log.Error("Failed to send sensu event", "error", err, "sensu", this.Name)
		return err
	}

	return nil
}
52  pkg/services/alerting/notifiers/sensu_test.go  Normal file
@@ -0,0 +1,52 @@
package notifiers

import (
	"testing"

	"github.com/grafana/grafana/pkg/components/simplejson"
	m "github.com/grafana/grafana/pkg/models"
	. "github.com/smartystreets/goconvey/convey"
)

func TestSensuNotifier(t *testing.T) {
	Convey("Sensu notifier tests", t, func() {
		Convey("Parsing alert notification from settings", func() {
			Convey("empty settings should return error", func() {
				json := `{ }`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name:     "sensu",
					Type:     "sensu",
					Settings: settingsJSON,
				}

				_, err := NewSensuNotifier(model)
				So(err, ShouldNotBeNil)
			})

			Convey("from settings", func() {
				json := `
				{
					"url": "http://sensu-api.example.com:4567/results"
				}`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name:     "sensu",
					Type:     "sensu",
					Settings: settingsJSON,
				}

				not, err := NewSensuNotifier(model)
				sensuNotifier := not.(*SensuNotifier)

				So(err, ShouldBeNil)
				So(sensuNotifier.Name, ShouldEqual, "sensu")
				So(sensuNotifier.Type, ShouldEqual, "sensu")
				So(sensuNotifier.Url, ShouldEqual, "http://sensu-api.example.com:4567/results")
			})
		})
	})
}
@@ -50,7 +50,7 @@ func NewWebHookNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
 	return &WebhookNotifier{
 		NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
 		Url:          url,
-		User:         model.Settings.Get("user").MustString(),
+		User:         model.Settings.Get("username").MustString(),
 		Password:     model.Settings.Get("password").MustString(),
 		HttpMethod:   model.Settings.Get("httpMethod").MustString("POST"),
 		log:          log.New("alerting.notifier.webhook"),
@@ -14,9 +14,10 @@ import (
 	"runtime"
 	"strings"
 
-	"github.com/go-macaron/session"
 	"gopkg.in/ini.v1"
 
+	"github.com/go-macaron/session"
+
 	"github.com/grafana/grafana/pkg/log"
 	"github.com/grafana/grafana/pkg/util"
 )
@@ -2,7 +2,9 @@ package influxdb
 
 import (
 	"fmt"
+	"strconv"
 	"strings"
+	"time"
 
 	"regexp"
@@ -15,24 +17,53 @@ var (
 )
 
 func (query *Query) Build(queryContext *tsdb.QueryContext) (string, error) {
+	var res string
+
 	if query.UseRawQuery && query.RawQuery != "" {
-		q := query.RawQuery
-		q = strings.Replace(q, "$timeFilter", query.renderTimeFilter(queryContext), 1)
-		q = strings.Replace(q, "$interval", tsdb.CalculateInterval(queryContext.TimeRange), 1)
-
-		return q, nil
+		res = query.RawQuery
+	} else {
+		res = query.renderSelectors(queryContext)
+		res += query.renderMeasurement()
+		res += query.renderWhereClause()
+		res += query.renderTimeFilter(queryContext)
+		res += query.renderGroupBy(queryContext)
 	}
 
-	res := query.renderSelectors(queryContext)
-	res += query.renderMeasurement()
-	res += query.renderWhereClause()
-	res += query.renderTimeFilter(queryContext)
-	res += query.renderGroupBy(queryContext)
+	interval, err := getDefinedInterval(query, queryContext)
+	if err != nil {
+		return "", err
+	}
+
+	res = strings.Replace(res, "$timeFilter", query.renderTimeFilter(queryContext), 1)
+	res = strings.Replace(res, "$interval", interval.Text, 1)
+	res = strings.Replace(res, "$__interval_ms", strconv.FormatInt(interval.Value.Nanoseconds()/int64(time.Millisecond), 10), 1)
+	res = strings.Replace(res, "$__interval", interval.Text, 1)
 	return res, nil
 }
 
+func getDefinedInterval(query *Query, queryContext *tsdb.QueryContext) (*tsdb.Interval, error) {
+	defaultInterval := tsdb.CalculateInterval(queryContext.TimeRange)
+
+	if query.Interval == "" {
+		return &defaultInterval, nil
+	}
+
+	setInterval := strings.Replace(strings.Replace(query.Interval, "<", "", 1), ">", "", 1)
+	parsedSetInterval, err := time.ParseDuration(setInterval)
+
+	if err != nil {
+		return nil, err
+	}
+
+	if strings.Contains(query.Interval, ">") {
+		if defaultInterval.Value > parsedSetInterval {
+			return &defaultInterval, nil
+		}
+	}
+
+	return &tsdb.Interval{Value: parsedSetInterval, Text: setInterval}, nil
+}
+
 func (query *Query) renderTags() []string {
 	var res []string
 	for i, tag := range query.Tags {
@@ -3,7 +3,6 @@ package influxdb
 import (
 	"fmt"
 	"strings"
-	"time"
 
 	"github.com/grafana/grafana/pkg/tsdb"
 )
@@ -93,30 +92,10 @@ func fieldRenderer(query *Query, queryContext *tsdb.QueryContext, part *QueryPart
 	return fmt.Sprintf(`"%s"`, part.Params[0])
 }
 
-func getDefinedInterval(query *Query, queryContext *tsdb.QueryContext) string {
-	setInterval := strings.Replace(strings.Replace(query.Interval, "<", "", 1), ">", "", 1)
-	defaultInterval := tsdb.CalculateInterval(queryContext.TimeRange)
-
-	if strings.Contains(query.Interval, ">") {
-		parsedDefaultInterval, err := time.ParseDuration(defaultInterval)
-		parsedSetInterval, err2 := time.ParseDuration(setInterval)
-
-		if err == nil && err2 == nil && parsedDefaultInterval > parsedSetInterval {
-			return defaultInterval
-		}
-	}
-
-	return setInterval
-}
-
 func functionRenderer(query *Query, queryContext *tsdb.QueryContext, part *QueryPart, innerExpr string) string {
 	for i, param := range part.Params {
-		if param == "$interval" || param == "auto" {
-			if query.Interval != "" {
-				part.Params[i] = getDefinedInterval(query, queryContext)
-			} else {
-				part.Params[i] = tsdb.CalculateInterval(queryContext.TimeRange)
-			}
+		if part.Type == "time" && param == "auto" {
+			part.Params[i] = "$__interval"
 		}
 	}
@@ -42,7 +42,7 @@ func TestInfluxdbQueryPart(t *testing.T) {
 			So(err, ShouldBeNil)
 
 			res := part.Render(query, queryContext, "")
-			So(res, ShouldEqual, "time(200ms)")
+			So(res, ShouldEqual, "time($interval)")
 		})
 
 		Convey("render time with auto", func() {
@@ -50,28 +50,7 @@ func TestInfluxdbQueryPart(t *testing.T) {
 			So(err, ShouldBeNil)
 
 			res := part.Render(query, queryContext, "")
-			So(res, ShouldEqual, "time(200ms)")
+			So(res, ShouldEqual, "time($__interval)")
 		})
 
-		Convey("render time interval >10s", func() {
-			part, err := NewQueryPart("time", []string{"$interval"})
-			So(err, ShouldBeNil)
-
-			query.Interval = ">10s"
-
-			res := part.Render(query, queryContext, "")
-			So(res, ShouldEqual, "time(10s)")
-		})
-
-		Convey("render time interval >1s and higher interval calculation", func() {
-			part, err := NewQueryPart("time", []string{"$interval"})
-			queryContext := &tsdb.QueryContext{TimeRange: tsdb.NewTimeRange("1y", "now")}
-			So(err, ShouldBeNil)
-
-			query.Interval = ">1s"
-
-			res := part.Render(query, queryContext, "")
-			So(res, ShouldEqual, "time(168h)")
-		})
-
 		Convey("render spread", func() {
@@ -16,10 +16,15 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 	qp1, _ := NewQueryPart("field", []string{"value"})
 	qp2, _ := NewQueryPart("mean", []string{})
 
-	groupBy1, _ := NewQueryPart("time", []string{"$interval"})
+	mathPartDivideBy100, _ := NewQueryPart("math", []string{"/ 100"})
+	mathPartDivideByIntervalMs, _ := NewQueryPart("math", []string{"/ $__interval_ms"})
+
+	groupBy1, _ := NewQueryPart("time", []string{"$__interval"})
 	groupBy2, _ := NewQueryPart("tag", []string{"datacenter"})
 	groupBy3, _ := NewQueryPart("fill", []string{"null"})
 
+	groupByOldInterval, _ := NewQueryPart("time", []string{"$interval"})
+
 	tag1 := &Tag{Key: "hostname", Value: "server1", Operator: "="}
 	tag2 := &Tag{Key: "hostname", Value: "server2", Operator: "=", Condition: "OR"}
 
@@ -55,6 +60,43 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 			So(rawQuery, ShouldEqual, `SELECT mean("value") FROM "cpu" WHERE "hostname" = 'server1' OR "hostname" = 'server2' AND time > now() - 5m GROUP BY time(5s), "datacenter" fill(null)`)
 		})
 
+		Convey("can build query with math part", func() {
+			query := &Query{
+				Selects:     []*Select{{*qp1, *qp2, *mathPartDivideBy100}},
+				Measurement: "cpu",
+				Interval:    "5s",
+			}
+
+			rawQuery, err := query.Build(queryContext)
+			So(err, ShouldBeNil)
+			So(rawQuery, ShouldEqual, `SELECT mean("value") / 100 FROM "cpu" WHERE time > now() - 5m`)
+		})
+
+		Convey("can build query with math part using $__interval_ms variable", func() {
+			query := &Query{
+				Selects:     []*Select{{*qp1, *qp2, *mathPartDivideByIntervalMs}},
+				Measurement: "cpu",
+				Interval:    "5s",
+			}
+
+			rawQuery, err := query.Build(queryContext)
+			So(err, ShouldBeNil)
+			So(rawQuery, ShouldEqual, `SELECT mean("value") / 5000 FROM "cpu" WHERE time > now() - 5m`)
+		})
+
+		Convey("can build query with old $interval variable", func() {
+			query := &Query{
+				Selects:     []*Select{{*qp1, *qp2}},
+				Measurement: "cpu",
+				Policy:      "",
+				GroupBy:     []*QueryPart{groupByOldInterval},
+			}
+
+			rawQuery, err := query.Build(queryContext)
+			So(err, ShouldBeNil)
+			So(rawQuery, ShouldEqual, `SELECT mean("value") FROM "cpu" WHERE time > now() - 5m GROUP BY time(200ms)`)
+		})
+
 		Convey("can render time range", func() {
 			query := Query{}
 			Convey("render from: 2h to now-1h", func() {
@@ -139,4 +181,5 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 			So(query.renderMeasurement(), ShouldEqual, ` FROM "policy"./apa/`)
 		})
 	})
+
 }
@@ -12,14 +12,19 @@ var (
 	day time.Duration = time.Hour * 24 * 365
 )
 
-func CalculateInterval(timerange *TimeRange) string {
+type Interval struct {
+	Text  string
+	Value time.Duration
+}
+
+func CalculateInterval(timerange *TimeRange) Interval {
 	interval := time.Duration((timerange.MustGetTo().UnixNano() - timerange.MustGetFrom().UnixNano()) / defaultRes)
 
 	if interval < minInterval {
-		return formatDuration(minInterval)
+		return Interval{Text: formatDuration(minInterval), Value: interval}
 	}
 
-	return formatDuration(roundInterval(interval))
+	return Interval{Text: formatDuration(roundInterval(interval)), Value: interval}
 }
 
 func formatDuration(inter time.Duration) string {
@@ -18,28 +18,28 @@ func TestInterval(t *testing.T) {
 		tr := NewTimeRange("5m", "now")
 
 		interval := CalculateInterval(tr)
-		So(interval, ShouldEqual, "200ms")
+		So(interval.Text, ShouldEqual, "200ms")
 	})
 
 	Convey("for 15min", func() {
 		tr := NewTimeRange("15m", "now")
 
 		interval := CalculateInterval(tr)
-		So(interval, ShouldEqual, "500ms")
+		So(interval.Text, ShouldEqual, "500ms")
 	})
 
 	Convey("for 30min", func() {
 		tr := NewTimeRange("30m", "now")
 
 		interval := CalculateInterval(tr)
-		So(interval, ShouldEqual, "1s")
+		So(interval.Text, ShouldEqual, "1s")
 	})
 
 	Convey("for 1h", func() {
 		tr := NewTimeRange("1h", "now")
 
 		interval := CalculateInterval(tr)
-		So(interval, ShouldEqual, "2s")
+		So(interval.Text, ShouldEqual, "2s")
 	})
 
 	Convey("Round interval", func() {
@@ -89,7 +89,7 @@ export function functionRenderer(part, innerExpr) {
     var paramType = part.def.params[index];
     if (paramType.type === 'time') {
      if (value === 'auto') {
-        value = '$interval';
+        value = '$__interval';
       }
     }
     if (paramType.quote === 'single') {
@@ -191,6 +191,13 @@ class MetricsPanelCtrl extends PanelCtrl {
       return this.$q.when([]);
     }
 
+    // make shallow copy of scoped vars,
+    // and add built in variables interval and interval_ms
+    var scopedVars = Object.assign({}, this.panel.scopedVars, {
+      "__interval": {text: this.interval, value: this.interval},
+      "__interval_ms": {text: this.intervalMs, value: this.intervalMs},
+    });
+
     var metricsQuery = {
       panelId: this.panel.id,
       range: this.range,
@@ -200,7 +207,7 @@ class MetricsPanelCtrl extends PanelCtrl {
       targets: this.panel.targets,
      format: this.panel.renderer === 'png' ? 'png' : 'json',
       maxDataPoints: this.resolution,
-      scopedVars: this.panel.scopedVars,
+      scopedVars: scopedVars,
       cacheTimeout: this.panel.cacheTimeout
     };
 
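The Object.assign call above is a shallow copy: the panel's own scopedVars object is left untouched while the merged copy gains the two built-in variables. The same idea in Go, with plain strings standing in for the `{text, value}` objects the frontend actually stores:

```go
package main

import (
	"fmt"
	"strconv"
)

// mergeScopedVars mirrors the panel controller's shallow copy: panel
// variables are copied into a fresh map, then the built-in __interval and
// __interval_ms entries are overlaid, so the original map is never mutated.
func mergeScopedVars(panelVars map[string]string, interval string, intervalMs int64) map[string]string {
	merged := make(map[string]string, len(panelVars)+2)
	for k, v := range panelVars {
		merged[k] = v
	}
	merged["__interval"] = interval
	merged["__interval_ms"] = strconv.FormatInt(intervalMs, 10)
	return merged
}

func main() {
	orig := map[string]string{"server": "backend01"}
	merged := mergeScopedVars(orig, "10s", 10000)
	fmt.Println(merged["__interval"], merged["__interval_ms"], len(orig))
}
```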
@@ -150,6 +150,11 @@ describe('templateSrv', function() {
     expect(result).to.be('test,build=test2');
   });
 
+  it('multi value and distributed should render when not string', function() {
+    var result = _templateSrv.formatValue(['test'], 'distributed', { name: 'build' });
+    expect(result).to.be('test');
+  });
+
   it('slash should be properly escaped in regex format', function() {
     var result = _templateSrv.formatValue('Gi3/14', 'regex');
     expect(result).to.be('Gi3\\/14');
@@ -239,4 +244,16 @@ describe('templateSrv', function() {
       expect(target).to.be('Server: All, period: 13m');
     });
   });
+
+  describe('built in interval variables', function() {
+    beforeEach(function() {
+      initTemplateSrv([]);
+    });
+
+    it('should replace $__interval_ms with interval milliseconds', function() {
+      var target = _templateSrv.replace('10 * $__interval_ms', {"__interval_ms": {text: "100", value: "100"}});
+      expect(target).to.be('10 * 100');
+    });
+  });
 });
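The "distributed" format exercised by these specs emits the first value bare and prefixes every later value with the variable name, so `['test','test2']` with name `build` renders as `test,build=test2`. A sketch of that formatting rule (the helper name `distribute` is mine, not Grafana's):

```go
package main

import (
	"fmt"
	"strings"
)

// distribute sketches templateSrv's "distributed" format: the first value
// is emitted as-is and each subsequent value is prefixed with "<name>=",
// matching the spec expectation 'test,build=test2'.
func distribute(values []string, name string) string {
	parts := make([]string, len(values))
	for i, v := range values {
		if i == 0 {
			parts[i] = v
		} else {
			parts[i] = name + "=" + v
		}
	}
	return strings.Join(parts, ",")
}

func main() {
	fmt.Println(distribute([]string{"test", "test2"}, "build"))
}
```

The single-value case degenerates to the bare value, which is exactly what the new "should render when not string" spec asserts.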
@@ -17,6 +17,11 @@ function (angular, _, kbn) {
     this._grafanaVariables = {};
     this._adhocVariables = {};
 
+    // default built ins
+    this._builtIns = {};
+    this._builtIns['__interval'] = {text: '1s', value: '1s'};
+    this._builtIns['__interval_ms'] = {text: '100', value: '100'};
+
     this.init = function(variables) {
       this.variables = variables;
       this.updateTemplateData();
@@ -42,6 +47,7 @@ function (angular, _, kbn) {
 
       this._index[variable.name] = variable;
     }
+
   };
 
   this.variableInitialized = function(variable) {
@@ -97,13 +103,16 @@ function (angular, _, kbn) {
       return value.join('|');
     }
     case "distributed": {
-      return this.distributeVariable(value, variable.name);
-    }
-    default: {
       if (typeof value === 'string') {
         return value;
       }
-      return '{' + value.join(',') + '}';
+      return this.distributeVariable(value, variable.name);
+    }
+    default: {
+      if (_.isArray(value)) {
+        return '{' + value.join(',') + '}';
+      }
+      return value;
     }
   }
 };
@@ -132,7 +141,7 @@ function (angular, _, kbn) {
   str = _.escape(str);
   this._regex.lastIndex = 0;
   return str.replace(this._regex, function(match, g1, g2) {
-    if (self._index[g1 || g2]) {
+    if (self._index[g1 || g2] || self._builtIns[g1 || g2]) {
       return '<span class="template-variable">' + match + '</span>';
     }
     return match;
@@ -1,58 +1,17 @@
 {
-  "revision": 6,
-  "title": "TestData - Graph Panel Last 1h",
-  "tags": [
-    "grafana-test"
-  ],
-  "style": "dark",
-  "timezone": "browser",
-  "editable": true,
-  "sharedCrosshair": false,
-  "hideControls": false,
-  "time": {
-    "from": "2016-11-16T16:59:38.294Z",
-    "to": "2016-11-16T17:09:01.532Z"
-  },
-  "timepicker": {
-    "refresh_intervals": [
-      "5s",
-      "10s",
-      "30s",
-      "1m",
-      "5m",
-      "15m",
-      "30m",
-      "1h",
-      "2h",
-      "1d"
-    ],
-    "time_options": [
-      "5m",
-      "15m",
-      "1h",
-      "6h",
-      "12h",
-      "24h",
-      "2d",
-      "7d",
-      "30d"
-    ]
-  },
-  "templating": {
-    "list": []
-  },
   "annotations": {
     "list": []
   },
-  "refresh": false,
-  "schemaVersion": 13,
-  "version": 4,
-  "links": [],
+  "editable": true,
   "gnetId": null,
+  "graphTooltip": 0,
+  "hideControls": false,
+  "links": [],
+  "refresh": false,
+  "revision": 8,
   "rows": [
     {
       "collapse": false,
-      "editable": true,
       "height": "250px",
       "panels": [
         {
@@ -63,7 +22,6 @@
           "error": false,
           "fill": 1,
           "id": 1,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -137,7 +95,6 @@
           "error": false,
           "fill": 1,
           "id": 2,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -211,7 +168,6 @@
           "error": false,
           "fill": 1,
           "id": 3,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -278,17 +234,15 @@
           ]
         }
       ],
-      "title": "New row",
-      "showTitle": false,
-      "titleSize": "h6",
-      "isNew": false,
       "repeat": null,
+      "repeatIteration": null,
       "repeatRowId": null,
-      "repeatIteration": null
+      "showTitle": false,
+      "title": "New row",
+      "titleSize": "h6"
     },
     {
       "collapse": false,
-      "editable": true,
       "height": "250px",
       "panels": [
         {
@@ -299,7 +253,6 @@
           "error": false,
           "fill": 1,
           "id": 4,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -370,7 +323,6 @@
           "editable": true,
           "error": false,
           "id": 6,
-          "isNew": true,
           "links": [],
           "mode": "markdown",
           "span": 4,
@@ -378,17 +330,15 @@
           "type": "text"
         }
       ],
-      "title": "New row",
-      "showTitle": false,
-      "titleSize": "h6",
-      "isNew": false,
       "repeat": null,
+      "repeatIteration": null,
       "repeatRowId": null,
-      "repeatIteration": null
+      "showTitle": false,
+      "title": "New row",
+      "titleSize": "h6"
     },
     {
       "collapse": false,
-      "editable": true,
       "height": 336,
       "panels": [
         {
@@ -399,7 +349,6 @@
           "error": false,
           "fill": 1,
           "id": 5,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -481,7 +430,6 @@
           "editable": true,
           "error": false,
           "id": 7,
-          "isNew": true,
           "links": [],
           "mode": "markdown",
           "span": 4,
@@ -489,17 +437,15 @@
           "type": "text"
         }
       ],
-      "title": "New row",
-      "showTitle": false,
-      "titleSize": "h6",
-      "isNew": false,
       "repeat": null,
+      "repeatIteration": null,
       "repeatRowId": null,
-      "repeatIteration": null
+      "showTitle": false,
+      "title": "New row",
+      "titleSize": "h6"
     },
     {
       "collapse": false,
-      "editable": true,
       "height": "250px",
       "panels": [
         {
@@ -510,7 +456,6 @@
           "error": false,
           "fill": 1,
           "id": 8,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -584,7 +529,6 @@
           "error": false,
           "fill": 1,
           "id": 10,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -655,7 +599,6 @@
           "editable": true,
           "error": false,
           "id": 13,
-          "isNew": true,
           "links": [],
           "mode": "markdown",
           "span": 4,
@@ -663,17 +606,16 @@
           "type": "text"
         }
       ],
-      "title": "New row",
-      "showTitle": false,
-      "titleSize": "h6",
-      "isNew": false,
       "repeat": null,
+      "repeatIteration": null,
       "repeatRowId": null,
-      "repeatIteration": null
+      "showTitle": false,
+      "title": "New row",
+      "titleSize": "h6"
     },
     {
-      "isNew": false,
-      "title": "Dashboard Row",
+      "collapse": false,
+      "height": 250,
       "panels": [
         {
           "aliasColors": {},
@@ -683,7 +625,6 @@
           "error": false,
           "fill": 1,
           "id": 9,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -776,7 +717,6 @@
           "editable": true,
           "error": false,
           "id": 14,
-          "isNew": true,
           "links": [],
           "mode": "markdown",
           "span": 4,
@@ -784,17 +724,16 @@
           "type": "text"
         }
       ],
-      "showTitle": false,
-      "titleSize": "h6",
-      "height": 250,
       "repeat": null,
-      "repeatRowId": null,
       "repeatIteration": null,
-      "collapse": false
+      "repeatRowId": null,
+      "showTitle": false,
+      "title": "Dashboard Row",
+      "titleSize": "h6"
     },
     {
-      "isNew": false,
-      "title": "Dashboard Row",
+      "collapse": false,
+      "height": 250,
       "panels": [
         {
           "aliasColors": {},
@@ -804,7 +743,6 @@
           "error": false,
           "fill": 1,
           "id": 12,
-          "isNew": true,
           "legend": {
             "avg": false,
             "current": false,
@@ -833,12 +771,12 @@
           "steppedLine": false,
           "targets": [
             {
+              "alias": "",
               "hide": false,
               "refId": "B",
               "scenarioId": "csv_metric_values",
               "stringInput": "1,20,40,null,null,null,null,null,null,100,10,10,20,30,40,10",
-              "target": "",
-              "alias": ""
+              "target": ""
             },
             {
               "alias": "",
@@ -898,7 +836,6 @@
           "editable": true,
           "error": false,
           "id": 15,
-          "isNew": true,
           "links": [],
           "mode": "markdown",
           "span": 4,
@@ -906,13 +843,606 @@
           "type": "text"
         }
       ],
-      "showTitle": false,
-      "titleSize": "h6",
-      "height": 250,
       "repeat": null,
-      "repeatRowId": null,
       "repeatIteration": null,
-      "collapse": false
+      "repeatRowId": null,
+      "showTitle": false,
+      "title": "Dashboard Row",
+      "titleSize": "h6"
+    },
+    {
+      "collapse": false,
+      "height": 250,
+      "panels": [
+        {
+          "aliasColors": {},
+          "bars": false,
+          "datasource": "Grafana TestData",
+          "decimals": 3,
+          "fill": 1,
+          "id": 20,
+          "legend": {
+            "alignAsTable": true,
+            "avg": true,
+            "current": true,
+            "max": true,
+            "min": true,
+            "show": true,
+            "total": true,
+            "values": true
+          },
+          "lines": true,
+          "linewidth": 1,
+          "links": [],
+          "nullPointMode": "null",
+          "percentage": false,
+          "pointradius": 5,
+          "points": false,
+          "renderer": "flot",
+          "seriesOverrides": [],
+          "span": 12,
+          "stack": false,
+          "steppedLine": false,
+          "targets": [
+            {
+              "refId": "A",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            }
+          ],
+          "thresholds": [],
+          "timeFrom": null,
+          "timeShift": null,
+          "title": "Legend Table Single Series Should Take Minium Height",
+          "tooltip": {
+            "shared": true,
+            "sort": 0,
+            "value_type": "individual"
+          },
+          "type": "graph",
+          "xaxis": {
+            "mode": "time",
+            "name": null,
+            "show": true,
+            "values": []
+          },
+          "yaxes": [
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            },
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            }
+          ]
+        }
+      ],
+      "repeat": null,
+      "repeatIteration": null,
+      "repeatRowId": null,
+      "showTitle": false,
+      "title": "Dashboard Row",
+      "titleSize": "h6"
+    },
+    {
+      "collapse": false,
+      "height": 250,
+      "panels": [
+        {
+          "aliasColors": {},
+          "bars": false,
+          "datasource": "Grafana TestData",
+          "decimals": 3,
+          "fill": 1,
+          "id": 16,
+          "legend": {
+            "alignAsTable": true,
+            "avg": true,
+            "current": true,
+            "max": true,
+            "min": true,
+            "show": true,
+            "total": true,
+            "values": true
+          },
+          "lines": true,
+          "linewidth": 1,
+          "links": [],
+          "nullPointMode": "null",
+          "percentage": false,
+          "pointradius": 5,
+          "points": false,
+          "renderer": "flot",
+          "seriesOverrides": [],
+          "span": 6,
+          "stack": false,
+          "steppedLine": false,
+          "targets": [
+            {
+              "refId": "A",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "B",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "C",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "D",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            }
+          ],
+          "thresholds": [],
+          "timeFrom": null,
+          "timeShift": null,
+          "title": "Legend Table No Scroll Visible",
+          "tooltip": {
+            "shared": true,
+            "sort": 0,
+            "value_type": "individual"
+          },
+          "type": "graph",
+          "xaxis": {
+            "mode": "time",
+            "name": null,
+            "show": true,
+            "values": []
+          },
+          "yaxes": [
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            },
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            }
+          ]
+        },
+        {
+          "aliasColors": {},
+          "bars": false,
+          "datasource": "Grafana TestData",
+          "decimals": 3,
+          "fill": 1,
+          "id": 17,
+          "legend": {
+            "alignAsTable": true,
+            "avg": true,
+            "current": true,
+            "max": true,
+            "min": true,
+            "show": true,
+            "total": true,
+            "values": true
+          },
+          "lines": true,
+          "linewidth": 1,
+          "links": [],
+          "nullPointMode": "null",
+          "percentage": false,
+          "pointradius": 5,
+          "points": false,
+          "renderer": "flot",
+          "seriesOverrides": [],
+          "span": 6,
+          "stack": false,
+          "steppedLine": false,
+          "targets": [
+            {
+              "refId": "A",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "B",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "C",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "D",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "E",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "F",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "G",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "H",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "I",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "J",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            }
+          ],
+          "thresholds": [],
+          "timeFrom": null,
+          "timeShift": null,
+          "title": "Legend Table Should Scroll",
+          "tooltip": {
+            "shared": true,
+            "sort": 0,
+            "value_type": "individual"
+          },
+          "type": "graph",
+          "xaxis": {
+            "mode": "time",
+            "name": null,
+            "show": true,
+            "values": []
+          },
+          "yaxes": [
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            },
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            }
+          ]
+        }
+      ],
+      "repeat": null,
+      "repeatIteration": null,
+      "repeatRowId": null,
+      "showTitle": false,
+      "title": "Dashboard Row",
+      "titleSize": "h6"
+    },
+    {
+      "collapse": false,
+      "height": 250,
+      "panels": [
+        {
+          "aliasColors": {},
+          "bars": false,
+          "datasource": "Grafana TestData",
+          "decimals": 3,
+          "fill": 1,
+          "id": 18,
+          "legend": {
+            "alignAsTable": true,
+            "avg": true,
+            "current": true,
+            "max": true,
+            "min": true,
+            "rightSide": true,
+            "show": true,
+            "total": true,
+            "values": true
+          },
+          "lines": true,
+          "linewidth": 1,
+          "links": [],
+          "nullPointMode": "null",
+          "percentage": false,
+          "pointradius": 5,
+          "points": false,
+          "renderer": "flot",
+          "seriesOverrides": [],
+          "span": 6,
+          "stack": false,
+          "steppedLine": false,
+          "targets": [
+            {
+              "refId": "A",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "B",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "C",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "D",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            }
+          ],
+          "thresholds": [],
+          "timeFrom": null,
+          "timeShift": null,
+          "title": "Legend Table No Scroll Visible",
+          "tooltip": {
+            "shared": true,
+            "sort": 0,
+            "value_type": "individual"
+          },
+          "type": "graph",
+          "xaxis": {
+            "mode": "time",
+            "name": null,
+            "show": true,
+            "values": []
+          },
+          "yaxes": [
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            },
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            }
+          ]
+        },
+        {
+          "aliasColors": {},
+          "bars": false,
+          "datasource": "Grafana TestData",
+          "decimals": 3,
+          "fill": 1,
+          "id": 19,
+          "legend": {
+            "alignAsTable": true,
+            "avg": true,
+            "current": true,
+            "max": true,
+            "min": true,
+            "rightSide": true,
+            "show": true,
+            "total": true,
+            "values": true
+          },
+          "lines": true,
+          "linewidth": 1,
+          "links": [],
+          "nullPointMode": "null",
+          "percentage": false,
+          "pointradius": 5,
+          "points": false,
+          "renderer": "flot",
+          "seriesOverrides": [],
+          "span": 6,
+          "stack": false,
+          "steppedLine": false,
+          "targets": [
+            {
+              "refId": "A",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "B",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "C",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "D",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "E",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "F",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "G",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "H",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "I",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "J",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "K",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            },
+            {
+              "refId": "L",
+              "scenarioId": "csv_metric_values",
+              "stringInput": "1,20,90,30,5,0",
+              "target": ""
+            }
+          ],
+          "thresholds": [],
+          "timeFrom": null,
+          "timeShift": null,
+          "title": "Legend Table No Scroll Visible",
+          "tooltip": {
+            "shared": true,
+            "sort": 0,
+            "value_type": "individual"
+          },
+          "type": "graph",
+          "xaxis": {
+            "mode": "time",
+            "name": null,
+            "show": true,
+            "values": []
+          },
+          "yaxes": [
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            },
+            {
+              "format": "short",
+              "label": null,
+              "logBase": 1,
+              "max": null,
+              "min": null,
+              "show": true
+            }
+          ]
+        }
+      ],
+      "repeat": null,
+      "repeatIteration": null,
+      "repeatRowId": null,
+      "showTitle": false,
+      "title": "Dashboard Row",
+      "titleSize": "h6"
     }
-  ]
+  ],
+  "schemaVersion": 14,
+  "style": "dark",
+  "tags": [
+    "grafana-test"
+  ],
+  "templating": {
+    "list": []
+  },
+  "time": {
+    "from": "now-1h",
+    "to": "now"
+  },
+  "timepicker": {
+    "refresh_intervals": [
+      "5s",
+      "10s",
+      "30s",
+      "1m",
+      "5m",
+      "15m",
+      "30m",
+      "1h",
+      "2h",
+      "1d"
+    ],
+    "time_options": [
+      "5m",
+      "15m",
+      "1h",
+      "6h",
+      "12h",
+      "24h",
+      "2d",
+      "7d",
+      "30d"
+    ]
+  },
+  "timezone": "browser",
+  "title": "TestData - Graph Panel Last 1h",
+  "version": 2
 }
public/app/plugins/app/testdata/plugin.json (2 lines changed, vendored)
@@ -9,7 +9,7 @@
     "name": "Grafana Project",
     "url": "http://grafana.org"
   },
-  "version": "1.0.15",
+  "version": "1.0.17",
   "updated": "2016-09-26"
 },
@@ -197,15 +197,9 @@ function (angular, _, moment, kbn, ElasticQueryBuilder, IndexPattern, ElasticRes
       target = options.targets[i];
       if (target.hide) {continue;}

-      var queryObj = this.queryBuilder.build(target, adhocFilters);
+      var queryString = templateSrv.replace(target.query || '*', options.scopedVars, 'lucene');
+      var queryObj = this.queryBuilder.build(target, adhocFilters, queryString);
       var esQuery = angular.toJson(queryObj);
-      var luceneQuery = target.query || '*';
-      luceneQuery = templateSrv.replace(luceneQuery, options.scopedVars, 'lucene');
-      luceneQuery = angular.toJson(luceneQuery);
-      // remove inner quotes
-      luceneQuery = luceneQuery.substr(1, luceneQuery.length - 2);
-      esQuery = esQuery.replace("$lucene_query", luceneQuery);

       var searchType = (queryObj.size === 0 && this.esVersion < 5) ? 'count' : 'query_then_fetch';
       var header = this.getQueryHeader(searchType, options.range.from, options.range.to);
@@ -219,7 +213,6 @@ function (angular, _, moment, kbn, ElasticQueryBuilder, IndexPattern, ElasticRes
       return $q.when([]);
     }

-    payload = payload.replace(/\$interval/g, options.interval);
     payload = payload.replace(/\$timeFrom/g, options.range.from.valueOf());
     payload = payload.replace(/\$timeTo/g, options.range.to.valueOf());
     payload = templateSrv.replace(payload, options.scopedVars);
@@ -66,7 +66,7 @@ function (queryDef)
     esAgg.format = "epoch_millis";

     if (esAgg.interval === 'auto') {
-      esAgg.interval = "$interval";
+      esAgg.interval = "$__interval";
     }

     if (settings.missing) {
@@ -121,7 +121,7 @@ function (queryDef)
     }
   };

-  ElasticQueryBuilder.prototype.build = function(target, adhocFilters) {
+  ElasticQueryBuilder.prototype.build = function(target, adhocFilters, queryString) {
     // make sure query has defaults;
     target.metrics = target.metrics || [{ type: 'count', id: '1' }];
     target.dsType = 'elasticsearch';
@@ -138,7 +138,7 @@ function (queryDef)
         {
           "query_string": {
             "analyze_wildcard": true,
-            "query": '$lucene_query'
+            "query": queryString,
          }
        }
      ]
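A minimal sketch of the idea behind the elasticsearch change above, with hypothetical names (this is not the actual Grafana code): the templated lucene string is now embedded in the query object before serialization, instead of string-replacing a `$lucene_query` placeholder in the already-serialized JSON.

```javascript
// Hypothetical simplified builder: the query string is passed in and placed
// directly into the query_string filter, so JSON serialization handles any
// escaping for us.
function buildQuery(queryString) {
  return {
    query: {
      bool: {
        filter: [
          { query_string: { analyze_wildcard: true, query: queryString } }
        ]
      }
    }
  };
}

// Embedding before serialization keeps quotes intact, which the old
// substr/replace placeholder trick could corrupt.
var esQuery = JSON.stringify(buildQuery('hostname:"server 1"'));
```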
@@ -45,7 +45,7 @@ export default class InfluxDatasource

   query(options) {
     var timeFilter = this.getTimeFilter(options);
-    var scopedVars = options.scopedVars ? _.cloneDeep(options.scopedVars) : {};
+    var scopedVars = options.scopedVars;
     var targets = _.cloneDeep(options.targets);
     var queryTargets = [];
     var queryModel;
@@ -56,8 +56,8 @@ export default class InfluxDatasource

       queryTargets.push(target);

-      // build query
-      scopedVars.interval = {value: target.interval || options.interval};
+      // backward compatability
+      scopedVars.interval = scopedVars.__interval;

       queryModel = new InfluxQuery(target, this.templateSrv, scopedVars);
       return queryModel.render(true);
@@ -23,7 +23,7 @@ export default class InfluxQuery
     target.resultFormat = target.resultFormat || 'time_series';
     target.tags = target.tags || [];
     target.groupBy = target.groupBy || [
-      {type: 'time', params: ['$interval']},
+      {type: 'time', params: ['$__interval']},
      {type: 'fill', params: ['null']},
    ];
    target.select = target.select || [[
@@ -12,7 +12,7 @@ describe('InfluxQuery', function() {
       }, templateSrv, {});

       var queryText = query.render();
-      expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE $timeFilter GROUP BY time($interval) fill(null)');
+      expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE $timeFilter GROUP BY time($__interval) fill(null)');
     });
   });

@@ -24,7 +24,7 @@ describe('InfluxQuery', function() {
       }, templateSrv, {});

       var queryText = query.render();
-      expect(queryText).to.be('SELECT mean("value") FROM "5m_avg"."cpu" WHERE $timeFilter GROUP BY time($interval) fill(null)');
+      expect(queryText).to.be('SELECT mean("value") FROM "5m_avg"."cpu" WHERE $timeFilter GROUP BY time($__interval) fill(null)');
     });
   });

@@ -43,7 +43,7 @@ describe('InfluxQuery', function() {
       }, templateSrv, {});

       var queryText = query.render();
-      expect(queryText).to.be('SELECT mean("value") /100 AS "text" FROM "cpu" WHERE $timeFilter GROUP BY time($interval) fill(null)');
+      expect(queryText).to.be('SELECT mean("value") /100 AS "text" FROM "cpu" WHERE $timeFilter GROUP BY time($__interval) fill(null)');
     });
   });

@@ -58,7 +58,7 @@ describe('InfluxQuery', function() {
       var queryText = query.render();

       expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE "hostname" = \'server\\\\1\' AND $timeFilter'
-        + ' GROUP BY time($interval)');
+        + ' GROUP BY time($__interval)');
     });

     it('should switch regex operator with tag value is regex', function() {
@@ -69,7 +69,7 @@ describe('InfluxQuery', function() {
       }, templateSrv, {});

       var queryText = query.render();
-      expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE "app" =~ /e.*/ AND $timeFilter GROUP BY time($interval)');
+      expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE "app" =~ /e.*/ AND $timeFilter GROUP BY time($__interval)');
     });
   });

@@ -83,7 +83,7 @@ describe('InfluxQuery', function() {

       var queryText = query.render();
       expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE "hostname" = \'server1\' AND "app" = \'email\' AND ' +
-        '$timeFilter GROUP BY time($interval)');
+        '$timeFilter GROUP BY time($__interval)');
     });
   });

@@ -97,7 +97,7 @@ describe('InfluxQuery', function() {

       var queryText = query.render();
       expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE "hostname" = \'server1\' OR "hostname" = \'server2\' AND ' +
-        '$timeFilter GROUP BY time($interval)');
+        '$timeFilter GROUP BY time($__interval)');
     });
   });

@@ -124,7 +124,7 @@ describe('InfluxQuery', function() {

       var queryText = query.render();
       expect(queryText).to.be('SELECT mean("value") FROM "cpu" WHERE $timeFilter ' +
-        'GROUP BY time($interval), "host"');
+        'GROUP BY time($__interval), "host"');
     });
   });

@@ -148,7 +148,7 @@ describe('InfluxQuery', function() {
         groupBy: [{type: 'time'}, {type: 'fill', params: ['0']}],
       }, templateSrv, {});
       var queryText = query.render();
-      expect(queryText).to.be('SELECT "value" FROM "cpu" WHERE $timeFilter GROUP BY time($interval) fill(0)');
+      expect(queryText).to.be('SELECT "value" FROM "cpu" WHERE $timeFilter GROUP BY time($__interval) fill(0)');
     });
   });

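A minimal sketch of the variable rename these test diffs track, with a hypothetical simplified helper (not Grafana's templateSrv): `$interval` becomes the global `$__interval`, and the datasource keeps the legacy spelling working by aliasing it to the new scoped variable.

```javascript
// Hypothetical renderer: alias the legacy $interval scoped variable to the
// new $__interval one, then substitute both spellings.
function renderInterval(query, scopedVars) {
  scopedVars.interval = scopedVars.__interval; // backward compatibility
  return query.replace(/\$(__)?interval/g, function (match, prefix) {
    return scopedVars[prefix ? '__interval' : 'interval'].value;
  });
}

var sql = 'SELECT mean("value") FROM "cpu" GROUP BY time($__interval)';
renderInterval(sql, { __interval: { value: '10s' } });
// both the new and the legacy spelling resolve to the same value
```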
@@ -124,6 +124,7 @@ function (angular, _, $) {

     $container.toggleClass('graph-legend-table', panel.legend.alignAsTable === true);

+    var tableHeaderElem;
     if (panel.legend.alignAsTable) {
       var header = '<tr>';
       header += '<th colspan="2" style="text-align:left"></th>';
@@ -135,7 +136,7 @@ function (angular, _, $) {
         header += getTableHeaderHtml('total');
       }
       header += '</tr>';
-      $container.append($(header));
+      tableHeaderElem = $(header);
     }

     if (panel.legend.sort) {
@@ -148,6 +149,8 @@ function (angular, _, $) {
     }

     var seriesShown = 0;
+    var seriesElements = [];
+
     for (i = 0; i < seriesList.length; i++) {
       var series = seriesList[i];

@@ -156,6 +159,7 @@ function (angular, _, $) {
       }

       var html = '<div class="graph-legend-series';
+
       if (series.yaxis === 2) { html += ' graph-legend-series--right-y'; }
       if (ctrl.hiddenSeries[series.alias]) { html += ' graph-legend-series-hidden'; }
       html += '" data-series-index="' + i + '">';
@@ -180,7 +184,7 @@ function (angular, _, $) {
       }

       html += '</div>';
-      $container.append($(html));
+      seriesElements.push($(html));

       seriesShown++;
     }
@@ -193,9 +197,13 @@ function (angular, _, $) {
       }

       var topPadding = 6;
-      $container.css("max-height", maxHeight - topPadding);
+      var tbodyElem = $('<tbody></tbody>');
+      tbodyElem.css("max-height", maxHeight - topPadding);
+      tbodyElem.append(tableHeaderElem);
+      tbodyElem.append(seriesElements);
+      $container.append(tbodyElem);
     } else {
-      $container.css("max-height", "");
+      $container.append(seriesElements);
     }
   }
 }
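The legend refactor above can be sketched in a string-based stand-in for the jQuery DOM work (hypothetical helper, not the Grafana code): rows are collected first and, in table mode, wrapped in one scrollable `<tbody>` so only the body scrolls.

```javascript
// Hypothetical simplified renderer: collect legend rows, then either emit
// them directly or wrap them in a single scrollable tbody for table mode.
function renderLegend(rows, asTable) {
  var elements = rows.map(function (r) { return '<div>' + r + '</div>'; });
  if (asTable) {
    // display:block on the tbody is what makes overflow-y scrolling work
    return '<tbody style="display:block;overflow-y:auto">' +
      elements.join('') + '</tbody>';
  }
  return elements.join('');
}
```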
@@ -6,13 +6,6 @@
       <select class="gf-form-input" ng-model="ctrl.panel.mode" ng-options="f for f in ['html','markdown','text']"></select>
     </span>
   </div>

-  <div class="gf-form" ng-show="ctrl.panel.mode == 'text'">
-    <span class="gf-form-label">Font Size</span>
-    <span class="gf-form-select-wrapper">
-      <select class="gf-form-input" ng-model="ctrl.panel.style['font-size']" ng-options="f for f in ['6pt','7pt','8pt','10pt','12pt','14pt','16pt','18pt','20pt','24pt','28pt','32pt','36pt','42pt','48pt','52pt','60pt','72pt']"></select>
-    </span>
-  </div>
 </div>
@@ -28,6 +28,10 @@ export class TextPanelCtrl extends PanelCtrl
   onInitEditMode() {
     this.addEditorTab('Options', 'public/app/plugins/panel/text/editor.html');
     this.editorTabIndex = 1;
+
+    if (this.panel.mode === 'text') {
+      this.panel.mode = 'markdown';
+    }
   }

   onRefresh() {
@@ -39,8 +43,6 @@ export class TextPanelCtrl extends PanelCtrl
       this.renderMarkdown(this.panel.content);
     } else if (this.panel.mode === 'html') {
       this.updateContent(this.panel.content);
-    } else if (this.panel.mode === 'text') {
-      this.renderText(this.panel.content);
     }
     this.renderingCompleted();
   }
@@ -85,9 +85,11 @@
 }

 .graph-legend-table {
-  overflow-y: auto;
-  overflow-x: hidden;
-  display: table;
+  tbody {
+    display: block;
+    overflow-y: auto;
+    overflow-x: hidden;
+  }

 .graph-legend-series {
   display: table-row;
vendor/github.com/aws/aws-sdk-go/service/s3/api.go (14368 lines, generated, vendored, new file)
File diff suppressed because it is too large

vendor/github.com/aws/aws-sdk-go/service/s3/bucket_location.go (43 lines, generated, vendored, new file)
@@ -0,0 +1,43 @@
+package s3
+
+import (
+    "io/ioutil"
+    "regexp"
+
+    "github.com/aws/aws-sdk-go/aws"
+    "github.com/aws/aws-sdk-go/aws/awserr"
+    "github.com/aws/aws-sdk-go/aws/awsutil"
+    "github.com/aws/aws-sdk-go/aws/request"
+)
+
+var reBucketLocation = regexp.MustCompile(`>([^<>]+)<\/Location`)
+
+func buildGetBucketLocation(r *request.Request) {
+    if r.DataFilled() {
+        out := r.Data.(*GetBucketLocationOutput)
+        b, err := ioutil.ReadAll(r.HTTPResponse.Body)
+        if err != nil {
+            r.Error = awserr.New("SerializationError", "failed reading response body", err)
+            return
+        }
+
+        match := reBucketLocation.FindSubmatch(b)
+        if len(match) > 1 {
+            loc := string(match[1])
+            out.LocationConstraint = &loc
+        }
+    }
+}
+
+func populateLocationConstraint(r *request.Request) {
+    if r.ParamsFilled() && aws.StringValue(r.Config.Region) != "us-east-1" {
+        in := r.Params.(*CreateBucketInput)
+        if in.CreateBucketConfiguration == nil {
+            r.Params = awsutil.CopyOf(r.Params)
+            in = r.Params.(*CreateBucketInput)
+            in.CreateBucketConfiguration = &CreateBucketConfiguration{
+                LocationConstraint: r.Config.Region,
+            }
+        }
+    }
+}
|
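For illustration (not part of this diff): a minimal, standalone sketch of how the `reBucketLocation` pattern above pulls the region out of a GetBucketLocation XML response body. The sample body is an assumption for demonstration only.

```go
package main

import (
    "fmt"
    "regexp"
)

// Same pattern as bucket_location.go: capture the text between the tag's
// closing '>' and the "</Location..." closing tag.
var reBucketLocation = regexp.MustCompile(`>([^<>]+)<\/Location`)

func main() {
    // Hypothetical GetBucketLocation response body.
    body := `<?xml version="1.0" encoding="UTF-8"?>` +
        `<LocationConstraint xmlns="http://s3.amazonaws.com/doc/2006-03-01/">eu-west-1</LocationConstraint>`
    if match := reBucketLocation.FindSubmatch([]byte(body)); len(match) > 1 {
        fmt.Println(string(match[1])) // prints "eu-west-1"
    }
}
```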
36  vendor/github.com/aws/aws-sdk-go/service/s3/content_md5.go (generated, vendored, new file)
@@ -0,0 +1,36 @@
package s3

import (
    "crypto/md5"
    "encoding/base64"
    "io"

    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/request"
)

// contentMD5 computes and sets the HTTP Content-MD5 header for requests that
// require it.
func contentMD5(r *request.Request) {
    h := md5.New()

    // hash the body. seek back to the first position after reading to reset
    // the body for transmission. copy errors may be assumed to be from the
    // body.
    _, err := io.Copy(h, r.Body)
    if err != nil {
        r.Error = awserr.New("ContentMD5", "failed to read body", err)
        return
    }
    _, err = r.Body.Seek(0, 0)
    if err != nil {
        r.Error = awserr.New("ContentMD5", "failed to seek body", err)
        return
    }

    // encode the md5 checksum in base64 and set the request header.
    sum := h.Sum(nil)
    sum64 := make([]byte, base64.StdEncoding.EncodedLen(len(sum)))
    base64.StdEncoding.Encode(sum64, sum)
    r.HTTPRequest.Header.Set("Content-MD5", string(sum64))
}
46  vendor/github.com/aws/aws-sdk-go/service/s3/customizations.go (generated, vendored, new file)
@@ -0,0 +1,46 @@
package s3

import (
    "github.com/aws/aws-sdk-go/aws/client"
    "github.com/aws/aws-sdk-go/aws/request"
)

func init() {
    initClient = defaultInitClientFn
    initRequest = defaultInitRequestFn
}

func defaultInitClientFn(c *client.Client) {
    // Support building custom endpoints based on config
    c.Handlers.Build.PushFront(updateEndpointForS3Config)

    // Require SSL when using SSE keys
    c.Handlers.Validate.PushBack(validateSSERequiresSSL)
    c.Handlers.Build.PushBack(computeSSEKeys)

    // S3 uses custom error unmarshaling logic
    c.Handlers.UnmarshalError.Clear()
    c.Handlers.UnmarshalError.PushBack(unmarshalError)
}

func defaultInitRequestFn(r *request.Request) {
    // Add request handlers for specific platforms.
    // e.g. 100-continue support for PUT requests using Go 1.6
    platformRequestHandlers(r)

    switch r.Operation.Name {
    case opPutBucketCors, opPutBucketLifecycle, opPutBucketPolicy,
        opPutBucketTagging, opDeleteObjects, opPutBucketLifecycleConfiguration,
        opPutBucketReplication:
        // These S3 operations require Content-MD5 to be set
        r.Handlers.Build.PushBack(contentMD5)
    case opGetBucketLocation:
        // GetBucketLocation has custom parsing logic
        r.Handlers.Unmarshal.PushFront(buildGetBucketLocation)
    case opCreateBucket:
        // Auto-populate LocationConstraint with current region
        r.Handlers.Validate.PushFront(populateLocationConstraint)
    case opCopyObject, opUploadPartCopy, opCompleteMultipartUpload:
        r.Handlers.Unmarshal.PushFront(copyMultipartStatusOKUnmarhsalError)
    }
}
186  vendor/github.com/aws/aws-sdk-go/service/s3/host_style_bucket.go (generated, vendored, new file)
@@ -0,0 +1,186 @@
package s3

import (
    "bytes"
    "fmt"
    "net/url"
    "regexp"
    "strings"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/awsutil"
    "github.com/aws/aws-sdk-go/aws/request"
)

// an operationBlacklist is a list of operation names that a request
// handler should not be executed with.
type operationBlacklist []string

// Continue will return true if the Request's operation name is not
// in the blacklist. False otherwise.
func (b operationBlacklist) Continue(r *request.Request) bool {
    for i := 0; i < len(b); i++ {
        if b[i] == r.Operation.Name {
            return false
        }
    }
    return true
}

var accelerateOpBlacklist = operationBlacklist{
    opListBuckets, opCreateBucket, opDeleteBucket,
}

// Request handler to automatically add the bucket name to the endpoint domain
// if possible. This style of bucket is valid for all bucket names which are
// DNS compatible and do not contain "."
func updateEndpointForS3Config(r *request.Request) {
    forceHostStyle := aws.BoolValue(r.Config.S3ForcePathStyle)
    accelerate := aws.BoolValue(r.Config.S3UseAccelerate)

    if accelerate && accelerateOpBlacklist.Continue(r) {
        if forceHostStyle {
            if r.Config.Logger != nil {
                r.Config.Logger.Log("ERROR: aws.Config.S3UseAccelerate is not compatible with aws.Config.S3ForcePathStyle, ignoring S3ForcePathStyle.")
            }
        }
        updateEndpointForAccelerate(r)
    } else if !forceHostStyle && r.Operation.Name != opGetBucketLocation {
        updateEndpointForHostStyle(r)
    }
}

func updateEndpointForHostStyle(r *request.Request) {
    bucket, ok := bucketNameFromReqParams(r.Params)
    if !ok {
        // Ignore operation requests if the bucket name was not provided;
        // if this is an input validation error the validation handler
        // will report it.
        return
    }

    if !hostCompatibleBucketName(r.HTTPRequest.URL, bucket) {
        // bucket name must be valid to put into the host
        return
    }

    moveBucketToHost(r.HTTPRequest.URL, bucket)
}

var (
    accelElem = []byte("s3-accelerate.dualstack.")
)

func updateEndpointForAccelerate(r *request.Request) {
    bucket, ok := bucketNameFromReqParams(r.Params)
    if !ok {
        // Ignore operation requests if the bucket name was not provided;
        // if this is an input validation error the validation handler
        // will report it.
        return
    }

    if !hostCompatibleBucketName(r.HTTPRequest.URL, bucket) {
        r.Error = awserr.New("InvalidParameterException",
            fmt.Sprintf("bucket name %s is not compatibile with S3 Accelerate", bucket),
            nil)
        return
    }

    // Change endpoint from s3(-[a-z0-1-])?.amazonaws.com to s3-accelerate.amazonaws.com
    r.HTTPRequest.URL.Host = replaceHostRegion(r.HTTPRequest.URL.Host, "accelerate")

    if aws.BoolValue(r.Config.UseDualStack) {
        host := []byte(r.HTTPRequest.URL.Host)

        // Strip region from hostname
        if idx := bytes.Index(host, accelElem); idx >= 0 {
            start := idx + len(accelElem)
            if end := bytes.IndexByte(host[start:], '.'); end >= 0 {
                end += start + 1
                copy(host[start:], host[end:])
                host = host[:len(host)-(end-start)]
                r.HTTPRequest.URL.Host = string(host)
            }
        }
    }

    moveBucketToHost(r.HTTPRequest.URL, bucket)
}

// Attempts to retrieve the bucket name from the request input parameters.
// If no bucket is found, or the field is empty "", false will be returned.
func bucketNameFromReqParams(params interface{}) (string, bool) {
    b, _ := awsutil.ValuesAtPath(params, "Bucket")
    if len(b) == 0 {
        return "", false
    }

    if bucket, ok := b[0].(*string); ok {
        if bucketStr := aws.StringValue(bucket); bucketStr != "" {
            return bucketStr, true
        }
    }

    return "", false
}

// hostCompatibleBucketName returns true if the request should
// put the bucket in the host. This is false if S3ForcePathStyle is
// explicitly set or if the bucket is not DNS compatible.
func hostCompatibleBucketName(u *url.URL, bucket string) bool {
    // Bucket might be DNS compatible but dots in the hostname will fail
    // certificate validation, so do not use host-style.
    if u.Scheme == "https" && strings.Contains(bucket, ".") {
        return false
    }

    // if the bucket is DNS compatible
    return dnsCompatibleBucketName(bucket)
}

var reDomain = regexp.MustCompile(`^[a-z0-9][a-z0-9\.\-]{1,61}[a-z0-9]$`)
var reIPAddress = regexp.MustCompile(`^(\d+\.){3}\d+$`)

// dnsCompatibleBucketName returns true if the bucket name is DNS compatible.
// Buckets created outside of the classic region MUST be DNS compatible.
func dnsCompatibleBucketName(bucket string) bool {
    return reDomain.MatchString(bucket) &&
        !reIPAddress.MatchString(bucket) &&
        !strings.Contains(bucket, "..")
}

// moveBucketToHost moves the bucket name from the URI path to URL host.
func moveBucketToHost(u *url.URL, bucket string) {
    u.Host = bucket + "." + u.Host
    u.Path = strings.Replace(u.Path, "/{Bucket}", "", -1)
    if u.Path == "" {
        u.Path = "/"
    }
}

const s3HostPrefix = "s3"

// replaceHostRegion replaces the S3 region string in the host with the
// value provided. If v is empty the host prefix returned will be s3.
func replaceHostRegion(host, v string) string {
    if !strings.HasPrefix(host, s3HostPrefix) {
        return host
    }

    suffix := host[len(s3HostPrefix):]
    for i := len(s3HostPrefix); i < len(host); i++ {
        if host[i] == '.' {
            // Trim until '.', leaving it in place.
            suffix = host[i:]
            break
        }
    }

    if len(v) == 0 {
        return fmt.Sprintf("s3%s", suffix)
    }

    return fmt.Sprintf("s3-%s%s", v, suffix)
}
8  vendor/github.com/aws/aws-sdk-go/service/s3/platform_handlers.go (generated, vendored, new file)
@@ -0,0 +1,8 @@
// +build !go1.6

package s3

import "github.com/aws/aws-sdk-go/aws/request"

func platformRequestHandlers(r *request.Request) {
}
28  vendor/github.com/aws/aws-sdk-go/service/s3/platform_handlers_go1.6.go (generated, vendored, new file)
@@ -0,0 +1,28 @@
// +build go1.6

package s3

import (
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/request"
)

func platformRequestHandlers(r *request.Request) {
    if r.Operation.HTTPMethod == "PUT" {
        // 100-Continue should only be used on put requests.
        r.Handlers.Sign.PushBack(add100Continue)
    }
}

func add100Continue(r *request.Request) {
    if aws.BoolValue(r.Config.S3Disable100Continue) {
        return
    }
    if r.HTTPRequest.ContentLength < 1024*1024*2 {
        // Ignore requests smaller than 2MB. This helps prevent delaying
        // requests unnecessarily.
        return
    }

    r.HTTPRequest.Header.Set("Expect", "100-Continue")
}
86  vendor/github.com/aws/aws-sdk-go/service/s3/service.go (generated, vendored, new file)
@@ -0,0 +1,86 @@
// THIS FILE IS AUTOMATICALLY GENERATED. DO NOT EDIT.

package s3

import (
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/client"
    "github.com/aws/aws-sdk-go/aws/client/metadata"
    "github.com/aws/aws-sdk-go/aws/request"
    "github.com/aws/aws-sdk-go/aws/signer/v4"
    "github.com/aws/aws-sdk-go/private/protocol/restxml"
)

// S3 is a client for Amazon S3.
// The service client's operations are safe to be used concurrently.
// It is not safe to mutate any of the client's properties though.
type S3 struct {
    *client.Client
}

// Used for custom client initialization logic
var initClient func(*client.Client)

// Used for custom request initialization logic
var initRequest func(*request.Request)

// A ServiceName is the name of the service the client will make API calls to.
const ServiceName = "s3"

// New creates a new instance of the S3 client with a session.
// If additional configuration is needed for the client instance use the optional
// aws.Config parameter to add your extra config.
//
// Example:
//     // Create a S3 client from just a session.
//     svc := s3.New(mySession)
//
//     // Create a S3 client with additional configuration
//     svc := s3.New(mySession, aws.NewConfig().WithRegion("us-west-2"))
func New(p client.ConfigProvider, cfgs ...*aws.Config) *S3 {
    c := p.ClientConfig(ServiceName, cfgs...)
    return newClient(*c.Config, c.Handlers, c.Endpoint, c.SigningRegion)
}

// newClient creates, initializes and returns a new service client instance.
func newClient(cfg aws.Config, handlers request.Handlers, endpoint, signingRegion string) *S3 {
    svc := &S3{
        Client: client.New(
            cfg,
            metadata.ClientInfo{
                ServiceName:   ServiceName,
                SigningRegion: signingRegion,
                Endpoint:      endpoint,
                APIVersion:    "2006-03-01",
            },
            handlers,
        ),
    }

    // Handlers
    svc.Handlers.Sign.PushBackNamed(v4.SignRequestHandler)
    svc.Handlers.Build.PushBackNamed(restxml.BuildHandler)
    svc.Handlers.Unmarshal.PushBackNamed(restxml.UnmarshalHandler)
    svc.Handlers.UnmarshalMeta.PushBackNamed(restxml.UnmarshalMetaHandler)
    svc.Handlers.UnmarshalError.PushBackNamed(restxml.UnmarshalErrorHandler)

    // Run custom client initialization if present
    if initClient != nil {
        initClient(svc.Client)
    }

    return svc
}

// newRequest creates a new request for a S3 operation and runs any
// custom request initialization.
func (c *S3) newRequest(op *request.Operation, params, data interface{}) *request.Request {
    req := c.NewRequest(op, params, data)

    // Run custom request initialization if present
    if initRequest != nil {
        initRequest(req)
    }

    return req
}
44  vendor/github.com/aws/aws-sdk-go/service/s3/sse.go (generated, vendored, new file)
@@ -0,0 +1,44 @@
package s3

import (
    "crypto/md5"
    "encoding/base64"

    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/awsutil"
    "github.com/aws/aws-sdk-go/aws/request"
)

var errSSERequiresSSL = awserr.New("ConfigError", "cannot send SSE keys over HTTP.", nil)

func validateSSERequiresSSL(r *request.Request) {
    if r.HTTPRequest.URL.Scheme != "https" {
        p, _ := awsutil.ValuesAtPath(r.Params, "SSECustomerKey||CopySourceSSECustomerKey")
        if len(p) > 0 {
            r.Error = errSSERequiresSSL
        }
    }
}

func computeSSEKeys(r *request.Request) {
    headers := []string{
        "x-amz-server-side-encryption-customer-key",
        "x-amz-copy-source-server-side-encryption-customer-key",
    }

    for _, h := range headers {
        md5h := h + "-md5"
        if key := r.HTTPRequest.Header.Get(h); key != "" {
            // Base64-encode the value
            b64v := base64.StdEncoding.EncodeToString([]byte(key))
            r.HTTPRequest.Header.Set(h, b64v)

            // Add MD5 if it wasn't computed
            if r.HTTPRequest.Header.Get(md5h) == "" {
                sum := md5.Sum([]byte(key))
                b64sum := base64.StdEncoding.EncodeToString(sum[:])
                r.HTTPRequest.Header.Set(md5h, b64sum)
            }
        }
    }
}
36  vendor/github.com/aws/aws-sdk-go/service/s3/statusok_error.go (generated, vendored, new file)
@@ -0,0 +1,36 @@
package s3

import (
    "bytes"
    "io/ioutil"
    "net/http"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/request"
)

func copyMultipartStatusOKUnmarhsalError(r *request.Request) {
    b, err := ioutil.ReadAll(r.HTTPResponse.Body)
    if err != nil {
        r.Error = awserr.New("SerializationError", "unable to read response body", err)
        return
    }
    body := bytes.NewReader(b)
    r.HTTPResponse.Body = aws.ReadSeekCloser(body)
    defer r.HTTPResponse.Body.(aws.ReaderSeekerCloser).Seek(0, 0)

    if body.Len() == 0 {
        // If there is no body don't attempt to parse the body.
        return
    }

    unmarshalError(r)
    if err, ok := r.Error.(awserr.Error); ok && err != nil {
        if err.Code() == "SerializationError" {
            r.Error = nil
            return
        }
        r.HTTPResponse.StatusCode = http.StatusServiceUnavailable
    }
}
65  vendor/github.com/aws/aws-sdk-go/service/s3/unmarshal_error.go (generated, vendored, new file)
@@ -0,0 +1,65 @@
package s3

import (
    "encoding/xml"
    "fmt"
    "io"
    "io/ioutil"
    "net/http"
    "strings"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/request"
)

type xmlErrorResponse struct {
    XMLName xml.Name `xml:"Error"`
    Code    string   `xml:"Code"`
    Message string   `xml:"Message"`
}

func unmarshalError(r *request.Request) {
    defer r.HTTPResponse.Body.Close()
    defer io.Copy(ioutil.Discard, r.HTTPResponse.Body)

    // Bucket exists in a different region, and request needs
    // to be made to the correct region.
    if r.HTTPResponse.StatusCode == http.StatusMovedPermanently {
        r.Error = awserr.NewRequestFailure(
            awserr.New("BucketRegionError",
                fmt.Sprintf("incorrect region, the bucket is not in '%s' region",
                    aws.StringValue(r.Config.Region)),
                nil),
            r.HTTPResponse.StatusCode,
            r.RequestID,
        )
        return
    }

    var errCode, errMsg string

    // Attempt to parse error from body if it is known
    resp := &xmlErrorResponse{}
    err := xml.NewDecoder(r.HTTPResponse.Body).Decode(resp)
    if err != nil && err != io.EOF {
        errCode = "SerializationError"
        errMsg = "failed to decode S3 XML error response"
    } else {
        errCode = resp.Code
        errMsg = resp.Message
    }

    // Fallback to status code converted to message if still no error code
    if len(errCode) == 0 {
        statusText := http.StatusText(r.HTTPResponse.StatusCode)
        errCode = strings.Replace(statusText, " ", "", -1)
        errMsg = statusText
    }

    r.Error = awserr.NewRequestFailure(
        awserr.New(errCode, errMsg, nil),
        r.HTTPResponse.StatusCode,
        r.RequestID,
    )
}
139  vendor/github.com/aws/aws-sdk-go/service/s3/waiters.go (generated, vendored, new file)
@@ -0,0 +1,139 @@
// THIS FILE IS AUTOMATICALLY GENERATED. DO NOT EDIT.

package s3

import (
    "github.com/aws/aws-sdk-go/private/waiter"
)

// WaitUntilBucketExists uses the Amazon S3 API operation
// HeadBucket to wait for a condition to be met before returning.
// If the condition is not met within the max attempt window an error will
// be returned.
func (c *S3) WaitUntilBucketExists(input *HeadBucketInput) error {
    waiterCfg := waiter.Config{
        Operation:   "HeadBucket",
        Delay:       5,
        MaxAttempts: 20,
        Acceptors: []waiter.WaitAcceptor{
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 200,
            },
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 301,
            },
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 403,
            },
            {
                State:    "retry",
                Matcher:  "status",
                Argument: "",
                Expected: 404,
            },
        },
    }

    w := waiter.Waiter{
        Client: c,
        Input:  input,
        Config: waiterCfg,
    }
    return w.Wait()
}

// WaitUntilBucketNotExists uses the Amazon S3 API operation
// HeadBucket to wait for a condition to be met before returning.
// If the condition is not met within the max attempt window an error will
// be returned.
func (c *S3) WaitUntilBucketNotExists(input *HeadBucketInput) error {
    waiterCfg := waiter.Config{
        Operation:   "HeadBucket",
        Delay:       5,
        MaxAttempts: 20,
        Acceptors: []waiter.WaitAcceptor{
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 404,
            },
        },
    }

    w := waiter.Waiter{
        Client: c,
        Input:  input,
        Config: waiterCfg,
    }
    return w.Wait()
}

// WaitUntilObjectExists uses the Amazon S3 API operation
// HeadObject to wait for a condition to be met before returning.
// If the condition is not met within the max attempt window an error will
// be returned.
func (c *S3) WaitUntilObjectExists(input *HeadObjectInput) error {
    waiterCfg := waiter.Config{
        Operation:   "HeadObject",
        Delay:       5,
        MaxAttempts: 20,
        Acceptors: []waiter.WaitAcceptor{
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 200,
            },
            {
                State:    "retry",
                Matcher:  "status",
                Argument: "",
                Expected: 404,
            },
        },
    }

    w := waiter.Waiter{
        Client: c,
        Input:  input,
        Config: waiterCfg,
    }
    return w.Wait()
}

// WaitUntilObjectNotExists uses the Amazon S3 API operation
// HeadObject to wait for a condition to be met before returning.
// If the condition is not met within the max attempt window an error will
// be returned.
func (c *S3) WaitUntilObjectNotExists(input *HeadObjectInput) error {
    waiterCfg := waiter.Config{
        Operation:   "HeadObject",
        Delay:       5,
        MaxAttempts: 20,
        Acceptors: []waiter.WaitAcceptor{
            {
                State:    "success",
                Matcher:  "status",
                Argument: "",
                Expected: 404,
            },
        },
    }

    w := waiter.Waiter{
        Client: c,
        Input:  input,
        Config: waiterCfg,
    }
    return w.Wait()
}
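For illustration (not part of this diff): a minimal, standalone sketch of the waiter semantics encoded above for WaitUntilBucketExists (poll up to MaxAttempts times, sleeping Delay between "retry" matches). `waitUntil` and `checkBucket` are hypothetical stand-ins for the waiter package and the HeadBucket call.

```go
package main

import (
    "errors"
    "fmt"
    "time"
)

// waitUntil polls check() until an acceptor matches or attempts run out.
func waitUntil(check func() int, delay time.Duration, maxAttempts int) error {
    for i := 0; i < maxAttempts; i++ {
        switch status := check(); {
        case status == 200 || status == 301 || status == 403:
            return nil // "success" acceptors for WaitUntilBucketExists
        case status == 404:
            time.Sleep(delay) // "retry" acceptor: wait and try again
        }
    }
    return errors.New("exceeded max wait attempts")
}

func main() {
    attempts := 0
    checkBucket := func() int { // hypothetical HeadBucket stand-in
        attempts++
        if attempts < 3 {
            return 404 // bucket not visible yet
        }
        return 200
    }
    fmt.Println(waitUntil(checkBucket, time.Millisecond, 20)) // <nil>
}
```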
8  vendor/vendor.json (vendored)
@@ -298,6 +298,14 @@
 			"version": "v1.5.8",
 			"versionExact": "v1.5.8"
 		},
+		{
+			"checksumSHA1": "HtKiIAPKsBg2s1c5ytRkdZ/lqO8=",
+			"path": "github.com/aws/aws-sdk-go/service/s3",
+			"revision": "898c81ba64b9a467379d35e3fabad133beae0ee4",
+			"revisionTime": "2016-11-18T23:08:35Z",
+			"version": "v1.5.8",
+			"versionExact": "v1.5.8"
+		},
 		{
 			"checksumSHA1": "ouwhxcAsIYQ6oJbMRdLW/Ys/iyg=",
 			"path": "github.com/aws/aws-sdk-go/service/sts",