mirror of https://github.com/grafana/grafana.git
synced 2025-02-25 18:55:37 -06:00
Merge branch 'master' into getting-started-panel
This commit is contained in:
47
CHANGELOG.md
@@ -1,9 +1,53 @@
# 4.0-stable (unrelased)
# 4.1-beta (unreleased)

### Enhancements

* **Postgres**: Add support for Certs for Postgres database [#6655](https://github.com/grafana/grafana/issues/6655)
* **Victorops**: Add VictorOps notification integration [#6411](https://github.com/grafana/grafana/issues/6411)
* **Opsgenie**: Add OpsGenie notification integration (by [@kylemcc](https://github.com/kylemcc)) [#6687](https://github.com/grafana/grafana/issues/6687)
* **Singlestat**: New aggregation on singlestat panel [#6740](https://github.com/grafana/grafana/pull/6740)
* **Cloudwatch**: Make it possible to specify access and secret key on the data source config page [#6697](https://github.com/grafana/grafana/issues/6697)
* **Table**: Added Hidden Column Style for Table Panel [#5677](https://github.com/grafana/grafana/pull/5677)
* **Graph**: Shared crosshair option renamed to shared tooltip, shows tooltip on all graphs as you hover over one graph. [#1578](https://github.com/grafana/grafana/pull/1578), [#6274](https://github.com/grafana/grafana/pull/6274)
* **Elasticsearch**: Added support for Missing option (bucket) for terms aggregation [#4244](https://github.com/grafana/grafana/pull/4244), thx @shanielh

### Bugfixes

* **API**: HTTP API for deleting org returning incorrect message for a non-existing org [#6679](https://github.com/grafana/grafana/issues/6679)
* **Dashboard**: Posting empty dashboard result in corrupted dashboard [#5443](https://github.com/grafana/grafana/issues/5443)

# 4.0.2 (unreleased)

### Enhancements

* **Playlist**: Add support for kiosk mode [#6727](https://github.com/grafana/grafana/issues/6727)

### Bugfixes

* **Alerting**: Add alert message to webhook notifications [#6807](https://github.com/grafana/grafana/issues/6807)
* **PNG Rendering**: Fix for server side rendering when using non default http addr bind and domain setting [#6813](https://github.com/grafana/grafana/issues/6813)
* **PNG Rendering**: Fix for server side rendering when setting enforce_domain to true [#6769](https://github.com/grafana/grafana/issues/6769)
* **Webhooks**: Add content type json to outgoing webhooks [#6822](https://github.com/grafana/grafana/issues/6822)
* **Keyboard shortcut**: Fixed zoom out shortcut [#6837](https://github.com/grafana/grafana/issues/6837)
* **Webdav**: Adds basic auth headers to webdav uploader [#6779](https://github.com/grafana/grafana/issues/6779)

# 4.0.1 (2016-12-02)

> **Notice**
4.0.0 had a serious connection pooling issue when using a data source in proxy access. This bug caused lots of resource issues
due to too many connections/file handles on the data source backend. This problem is fixed in this release.

### Bugfixes

* **Metrics**: Fixes nil pointer dereference on my arm build [#6749](https://github.com/grafana/grafana/issues/6749)
* **Data proxy**: Fixes a tcp pooling issue in the datasource reverse proxy [#6759](https://github.com/grafana/grafana/issues/6759)

# 4.0-stable (2016-11-29)

### Bugfixes

* **Server-side rendering**: Fixed address used when rendering panel via phantomjs and using non default http_addr config [#6660](https://github.com/grafana/grafana/issues/6660)
* **Graph panel**: Fixed graph panel tooltip sort order issue [#6648](https://github.com/grafana/grafana/issues/6648)
* **Unsaved changes**: You now navigate to the intended page after saving in the unsaved changes dialog [#6675](https://github.com/grafana/grafana/issues/6675)
* **TLS Client Auth**: Support for TLS client authentication for datasource proxies [#2316](https://github.com/grafana/grafana/issues/2316)
* **Alerts out of sync**: Saving dashboards with broken alerts causes sync problem [#6576](https://github.com/grafana/grafana/issues/6576)
* **Alerting**: Saving an alert with condition "HAS NO DATA" throws an error [#6701](https://github.com/grafana/grafana/issues/6701)
* **Config**: Improve error message when parsing broken config file [#6731](https://github.com/grafana/grafana/issues/6731)
* **Table**: Render empty dates as - instead of current date [#6728](https://github.com/grafana/grafana/issues/6728)

# 4.0-beta2 (2016-11-21)
@@ -19,6 +63,7 @@
* **Singlestat**: Support repeated template variables in prefix/postfix [#6595](https://github.com/grafana/grafana/issues/6595)
* **Templating**: Don't persist variable options with refresh option [#6586](https://github.com/grafana/grafana/issues/6586)
* **Alerting**: Add ability to have OR conditions (and mixing AND & OR) [#6579](https://github.com/grafana/grafana/issues/6579)
* **InfluxDB**: Fix for Ad-Hoc Filters variable & changing dashboards [#6821](https://github.com/grafana/grafana/issues/6821)

# 4.0-beta1 (2016-11-09)
@@ -17,7 +17,7 @@ Graphite, Elasticsearch, OpenTSDB, Prometheus and InfluxDB.
- [What's New in Grafana 2.1](http://docs.grafana.org/guides/whats-new-in-v2-1/)
- [What's New in Grafana 2.5](http://docs.grafana.org/guides/whats-new-in-v2-5/)
- [What's New in Grafana 3.0](http://docs.grafana.org/guides/whats-new-in-v3/)
- [What's New in Grafana 4.0 Beta](http://docs.grafana.org/guides/whats-new-in-v4/)
- [What's New in Grafana 4.0](http://docs.grafana.org/guides/whats-new-in-v4/)

## Features
### Graphite Target Editor

@@ -79,7 +79,7 @@ the latest master builds [here](http://grafana.org/builds)

### Dependencies

- Go 1.7
- Go 1.7.3
- NodeJS v4+

### Get Code

@@ -155,10 +155,6 @@ If you have any idea for an improvement or found a bug do not hesitate to open a

And if you have time clone this repo and submit a pull request and help me make Grafana
the kickass metrics & devops dashboard we all dream about!

Before creating a pull request be sure that "grunt test" runs without any style or unit test errors, also
please [sign the CLA](http://docs.grafana.org/project/cla/)

## License

Grafana is distributed under Apache 2.0 License.
Work in progress Grafana 2.0 (with included Grafana backend)
@@ -5,7 +5,7 @@ machine:
    GOPATH: "/home/ubuntu/.go_workspace"
    ORG_PATH: "github.com/grafana"
    REPO_PATH: "${ORG_PATH}/grafana"
    GODIST: "go1.7.3.linux-amd64.tar.gz"
    GODIST: "go1.7.4.linux-amd64.tar.gz"
  post:
    - mkdir -p download
    - test -e download/$GODIST || curl -o download/$GODIST https://storage.googleapis.com/golang/$GODIST
@@ -67,6 +67,7 @@ type = sqlite3
host = 127.0.0.1:3306
name = grafana
user = root
# If the password contains # or ; you have to wrap it with triple quotes. Ex """#password;"""
password =
# Use either URL or the previous fields to configure the database
# Example: mysql://user:secret@host:port/database

@@ -289,7 +290,7 @@ templates_pattern = emails/*.html

[log]
# Either "console", "file", "syslog". Default is console and file
# Use space to separate multiple modes, e.g. "console file"
mode = console, file
mode = console file

# Either "debug", "info", "warn", "error", "critical", default is "info"
level = info

@@ -69,6 +69,7 @@

;host = 127.0.0.1:3306
;name = grafana
;user = root
# If the password contains # or ; you have to wrap it with triple quotes. Ex """#password;"""
;password =

# Use either URL or the previous fields to configure the database

@@ -272,7 +273,7 @@

[log]
# Either "console", "file", "syslog". Default is console and file
# Use space to separate multiple modes, e.g. "console file"
;mode = console, file
;mode = console file

# Either "trace", "debug", "info", "warn", "error", "critical", default is "info"
;level = info
2
docker/blocks/elastic5/elasticsearch.yml
Normal file
@@ -0,0 +1,2 @@
script.inline: on
script.indexed: on
8
docker/blocks/elastic5/fig
Normal file
@@ -0,0 +1,8 @@
# You need to run 'sysctl -w vm.max_map_count=262144' on the host machine

elasticsearch5:
  image: elasticsearch:5
  command: elasticsearch
  ports:
    - "9200:9200"
    - "9300:9300"
23
docs/sources/alerting/metrics.md
Normal file
@@ -0,0 +1,23 @@
+++
title = "Alerting Metrics"
description = "Alerting Metrics Guide"
keywords = ["Grafana", "alerting", "guide", "metrics"]
type = "docs"
[menu.docs]
name = "Metrics"
parent = "alerting"
weight = 2
+++

# Metrics from the alert engine

> Alerting is only available in Grafana v4.0 and above.

The alert engine publishes some internal metrics about itself. You can read more about how Grafana publishes [internal metrics](/installation/configuration/#metrics).

Description | Type | Metric name
---------- | ----------- | ----------
Total number of alerts | counter | `alerting.active_alerts`
Alert execution result | counter | `alerting.result`
Notifications sent counter | counter | `alerting.notifications_sent`
Alert execution timer | timer | `alerting.execution_time`

@@ -91,7 +91,7 @@ Auto resolve incidents | Resolve incidents in pagerduty once the alert goes back

# Enable images in notifications {#external-image-store}

Grafan can render the panel associated with the alert rule and include that in the notification. Some types
Grafana can render the panel associated with the alert rule and include that in the notification. Some types
of notifications require that this image be publicly accessible (Slack for example). In order to support
images in notifications like Slack, Grafana can upload the image to an image store. It currently supports
Amazon S3 for this and Webdav. So to set that up you need to configure the
@@ -8,4 +8,24 @@ parent = "features"
weight = 5
+++

# Data Source Overview
Grafana supports many different storage backends for your time series data (Data Source). Each Data Source has a specific Query Editor that is customized for the features and capabilities that the particular Data Source exposes.

## Querying
The query language and capabilities of each Data Source are obviously very different. You can combine data from multiple Data Sources onto a single Dashboard, but each Panel is tied to a specific Data Source that belongs to a particular Organization.

## Supported Data Sources
The following datasources are officially supported:

* [Graphite]({{< relref "graphite.md" >}})
* [Elasticsearch]({{< relref "elasticsearch.md" >}})
* [CloudWatch]({{< relref "cloudwatch.md" >}})
* [InfluxDB]({{< relref "influxdb.md" >}})
* [OpenTSDB]({{< relref "opentsdb.md" >}})
* [Prometheus]({{< relref "prometheus.md" >}})

## Data source plugins

Since Grafana 3.0 you can install data sources as plugins. Check out [Grafana.net](https://grafana.net/plugins) for more data sources.
@@ -1,25 +0,0 @@
----
page_title: Data Source Overview
page_description: Data Source Overview
page_keywords: grafana, graphite, influxDB, KairosDB, OpenTSDB, Prometheus, documentation
---

# Data Source Overview
Grafana supports many different storage backends for your time series data (Data Source). Each Data Source has a specific Query Editor that is customized for the features and capabilities that the particular Data Source exposes.

## Querying
The query language and capabilities of each Data Source are obviously very different. You can combine data from multiple Data Sources onto a single Dashboard, but each Panel is tied to a specific Data Source that belongs to a particular Organization.

## Supported Data Sources
The following datasources are officially supported:

* [Graphite](/datasources/graphite/)
* [Elasticsearch](/datasources/elasticsearch/)
* [CloudWatch](/datasources/cloudwatch/)
* [InfluxDB](/datasources/influxdb/)
* [OpenTSDB](/datasources/opentsdb/)
* [KairosDB](/datasources/kairosdb)
* [Prometheus](/datasources/prometheus)

Grafana can query any Elasticsearch index for annotation events, but at this time, it's not supported for metric queries. Learn more about [annotations](/reference/annotations/#elasticsearch-annotations)
@@ -1,7 +1,29 @@
+++
title = "HTTP API"
description = "Grafana HTTP API"
keywords = ["grafana", "http", "documentation", "api", "overview"]
type = "docs"
[menu.docs]
name = "HTTP API"
identifier = "http_api"
weight = 9
+++

# HTTP API Reference

The Grafana backend exposes an HTTP API, the same API is used by the frontend to do everything from saving
dashboards, creating users and updating data sources.

## Supported HTTP APIs:

* [Authentication API]({{< relref "auth.md" >}})
* [Dashboard API]({{< relref "dashboard.md" >}})
* [Data Source API]({{< relref "data_source.md" >}})
* [Organisation API]({{< relref "org.md" >}})
* [User API]({{< relref "user.md" >}})
* [Admin API]({{< relref "admin.md" >}})
* [Snapshot API]({{< relref "snapshot.md" >}})
* [Preferences API]({{< relref "preferences.md" >}})
* [Other API]({{< relref "other.md" >}})

@@ -1,28 +0,0 @@

+++
title = "HTTP API "
description = "Grafana HTTP API"
keywords = ["grafana", "http", "documentation", "api", "overview"]
aliases = ["/http_api/overview/"]
type = "docs"
[menu.docs]
name = "Overview"
parent = "http_api"
+++

# HTTP API Reference

The Grafana backend exposes an HTTP API, the same API is used by the frontend to do everything from saving
dashboards, creating users and updating data sources.

## Supported HTTP APIs:

* [Authentication API](/http_api/auth/)
* [Dashboard API](/http_api/dashboard/)
* [Data Source API](/http_api/data_source/)
* [Organisation API](/http_api/org/)
* [User API](/http_api/user/)
* [Admin API](/http_api/admin/)
* [Snapshot API](/http_api/snapshot/)
* [Preferences API](/http_api/preferences/)
* [Other API](/http_api/other/)

@@ -170,7 +170,7 @@ The database user (not applicable for `sqlite3`).

### password

The database user's password (not applicable for `sqlite3`).
The database user's password (not applicable for `sqlite3`). If the password contains `#` or `;` you have to wrap it with triple quotes. Ex `"""#password;"""`

### ssl_mode
@@ -14,23 +14,16 @@ weight = 1
Description | Download
------------ | -------------
Stable for Debian-based Linux | [3.1.1 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_3.1.1-1470047149_amd64.deb)
Latest Beta for Debian-based Linux | [4.0.0-beta2 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.0-1479719016beta2_amd64.deb)
Stable for Debian-based Linux | [4.0.1 (x86-64 deb)](https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.1-1480694114_amd64.deb)

## Install Stable

```
$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_3.1.1-1470047149_amd64.deb
$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.1-1480694114_amd64.deb
$ sudo apt-get install -y adduser libfontconfig
$ sudo dpkg -i grafana_3.1.1-1470047149_amd64.deb
$ sudo dpkg -i grafana_4.0.1-1480694114_amd64.deb
```

## Install Latest Beta

$ wget https://grafanarel.s3.amazonaws.com/builds/grafana_4.0.0-1479719016beta2_amd64.deb
$ sudo apt-get install -y adduser libfontconfig
$ sudo dpkg -i grafana_4.0.0-1479719016beta2_amd64.deb

## APT Repository

Add the following line to your `/etc/apt/sources.list` file.
@@ -14,40 +14,24 @@ weight = 2
Description | Download
------------ | -------------
Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [3.1.1 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-3.1.1-1470047149.x86_64.rpm)
Latest Beta for CentOS / Fedora / OpenSuse / Redhat Linux | [4.0.0-beta2 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.0-1479719016beta2.x86_64.rpm)
Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [4.0.1 (x86-64 rpm)](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.1-1480694114.x86_64.rpm)

## Install Stable

You can install Grafana using Yum directly.

$ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-3.1.1-1470047149.x86_64.rpm
$ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.1-1480694114.x86_64.rpm

Or install manually using `rpm`.

#### On CentOS / Fedora / Redhat:

$ sudo yum install initscripts fontconfig
$ sudo rpm -Uvh grafana-3.1.1-1470047149.x86_64.rpm
$ sudo rpm -Uvh grafana-4.0.1-1480694114.x86_64.rpm

#### On OpenSuse:

$ sudo rpm -i --nodeps grafana-3.1.1-1470047149.x86_64.rpm

## Or Install Latest Beta

$ sudo yum install https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.0-1479719016beta2.x86_64.rpm

Or install manually using `rpm`.

#### On CentOS / Fedora / Redhat:

$ sudo yum install initscripts fontconfig
$ sudo rpm -Uvh grafana-4.0.0-1479719016beta2.x86_64.rpm

#### On OpenSuse:

$ sudo rpm -i --nodeps grafana-4.0.0-1479719016beta2.x86_64.rpm
$ sudo rpm -i --nodeps grafana-4.0.1-1480694114.x86_64.rpm

## Install via YUM Repository

@@ -13,9 +13,7 @@ weight = 3

Description | Download
------------ | -------------
Latest stable package for Windows | [grafana.3.1.1.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-3.1.1.windows-x64.zip)
Latest beta package for Windows | [grafana.4.0.0-beta2.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.0-beta2.windows-x64.zip)
Latest stable package for Windows | [grafana.4.0.1.windows-x64.zip](https://grafanarel.s3.amazonaws.com/builds/grafana-4.0.1.windows-x64.zip)

## Configure
@@ -37,7 +37,7 @@ The Datasource should contain the following functions.
```
query(options) //used by panels to get data
testDatasource() //used by datasource configuration page to make sure the connection is working
annotationsQuery(options) // used dashboards to get annotations
annotationsQuery(options) // used by dashboards to get annotations
metricFindQuery(options) // used by query editor to get metric suggestions.
```
@@ -13,7 +13,7 @@ dev environment. Grafana ships with its own required backend server; also comple
## Dependencies

- [Go 1.7](https://golang.org/dl/)
- [Go 1.7.3](https://golang.org/dl/)
- [NodeJS](https://nodejs.org/download/)

## Get Code

@@ -91,6 +91,3 @@ Learn more about Grafana config options in the [Configuration section](/installa

## Create a pull request
Please contribute to the Grafana project and submit a pull request! Build new features, write or update documentation, fix bugs and generally make Grafana even more awesome.

Before or after you create a pull request, sign the [contributor license agreement](/project/cla.html).
Together we can build amazing software faster.
@@ -48,12 +48,6 @@ populate the template variable to a desired value from the link.
The metrics tab defines what series data and sources to render. Each datasource provides different
options.

### Graphite

### InfluxDB

### OpenTSDB

## Axes & Grid



@@ -71,9 +65,6 @@ The ``Left Y`` and ``Right Y`` can be customized using:

Axes can also be hidden by unchecking the appropriate box from `Show Axis`.

Thresholds allow you to add arbitrary lines or sections to the graph to make it easier to see when
the graph crosses a particular threshold.

### Legend

The legend can be hidden by checking the ``Show`` checkbox. If it's shown, it can be

@@ -103,6 +94,12 @@ It is just the sum of all data points received by Grafana.

Display styles control properties of the graph.

### Thresholds

Thresholds allow you to add arbitrary lines or sections to the graph to make it easier to see when
the graph crosses a particular threshold.

### Chart Options

- ``Bar`` - Display values as a bar chart
@@ -23,7 +23,15 @@ The singlestat panel has a normal query editor to allow you define your exact me
1. `Big Value`: Big Value refers to how we display the main stat for the Singlestat Panel. This is always a single value that is displayed in the Panel in between two strings, `Prefix` and `Suffix`. The single number is calculated by choosing a function (min, max, average, current, total) of your metric query. This function reduces your query into a single numeric value.
2. `Font Size`: You can use this section to select the font size of the different texts in the Singlestat Panel, i.e. prefix, value and postfix.
3. `Values`: The Value fields let you set the function (min, max, average, current, total) that your entire query is reduced into a single value with. You can also set the font size of the Value field and font-size (as a %) of the metric query that the Panel is configured with. This reduces the entire query into a single summary value that is displayed.
3. `Values`: The Value fields let you set the function (min, max, average, current, total, first, delta, range) that your entire query is reduced into a single value with. You can also set the font size of the Value field and font-size (as a %) of the metric query that the Panel is configured with. This reduces the entire query into a single summary value that is displayed.
    * `min` - The smallest value in the series
    * `max` - The largest value in the series
    * `average` - The average of all the non-null values in the series
    * `current` - The last value in the series. If the series ends on null the previous value will be used.
    * `total` - The sum of all the non-null values in the series
    * `first` - The first value in the series
    * `delta` - The total incremental increase (of a counter) in the series. An attempt is made to account for counter resets, but this will only be accurate for single instance metrics. Used to show total counter increase in time series.
    * `range` - The difference between 'min' and 'max'. Useful to show the range of change for a gauge.
4. `Postfixes`: The Postfix fields let you define a custom label and font-size (as a %) to appear *after* the value
5. `Units`: Units are appended to the Singlestat within the panel, and will respect the color and threshold settings for the value.
6. `Decimals`: The Decimal field allows you to override the automatic decimal precision, and set it explicitly.
@@ -1,4 +1,4 @@
{
"stable": "3.1.1",
"testing": "3.1.1"
"stable": "4.0.1",
"testing": "4.0.1"
}

@@ -4,7 +4,7 @@

"company": "Coding Instinct AB"
},
"name": "grafana",
"version": "4.0.0-beta2",
"version": "4.1.0-pre1",
"repository": {
"type": "git",
"url": "http://github.com/grafana/grafana.git"

@@ -1,7 +1,6 @@

#! /usr/bin/env bash

deb_ver=3.1.1-1470047149
rpm_ver=3.1.1-1470047149
deb_ver=4.0.0-1480439068
rpm_ver=4.0.0-1480439068

wget https://grafanarel.s3.amazonaws.com/builds/grafana_${deb_ver}_amd64.deb
@@ -1,6 +1,11 @@
package api

import (
	"crypto/tls"
	"net"
	"net/http"
	"time"

	"gopkg.in/macaron.v1"

	"github.com/grafana/grafana/pkg/api/pluginproxy"

@@ -11,6 +16,16 @@ import (

	"github.com/grafana/grafana/pkg/util"
)

var pluginProxyTransport = &http.Transport{
	TLSClientConfig:     &tls.Config{InsecureSkipVerify: true},
	Proxy:               http.ProxyFromEnvironment,
	Dial: (&net.Dialer{
		Timeout:   30 * time.Second,
		KeepAlive: 30 * time.Second,
	}).Dial,
	TLSHandshakeTimeout: 10 * time.Second,
}

func InitAppPluginRoutes(r *macaron.Macaron) {
	for _, plugin := range plugins.Apps {
		for _, route := range plugin.Routes {

@@ -40,7 +55,7 @@ func AppPluginRoute(route *plugins.AppPluginRoute, appId string) macaron.Handler

path := c.Params("*")

proxy := pluginproxy.NewApiPluginProxy(c, path, route, appId)
proxy.Transport = dataProxyTransport
proxy.Transport = pluginProxyTransport
proxy.ServeHTTP(c.Resp, c.Req.Request)
}
}
@@ -33,6 +33,39 @@ type cwRequest struct {
	DataSource *m.DataSource
}

type datasourceInfo struct {
	Profile       string
	Region        string
	AssumeRoleArn string
	Namespace     string

	AccessKey string
	SecretKey string
}

func (req *cwRequest) GetDatasourceInfo() *datasourceInfo {
	assumeRoleArn := req.DataSource.JsonData.Get("assumeRoleArn").MustString()
	accessKey := ""
	secretKey := ""

	for key, value := range req.DataSource.SecureJsonData.Decrypt() {
		if key == "accessKey" {
			accessKey = value
		}
		if key == "secretKey" {
			secretKey = value
		}
	}

	return &datasourceInfo{
		AssumeRoleArn: assumeRoleArn,
		Region:        req.Region,
		Profile:       req.DataSource.Database,
		AccessKey:     accessKey,
		SecretKey:     secretKey,
	}
}

func init() {
	actionHandlers = map[string]actionHandler{
		"GetMetricStatistics": handleGetMetricStatistics,

@@ -56,8 +89,8 @@ type cache struct {

var awsCredentialCache map[string]cache = make(map[string]cache)
var credentialCacheLock sync.RWMutex

func getCredentials(profile string, region string, assumeRoleArn string) *credentials.Credentials {
cacheKey := profile + ":" + assumeRoleArn
func getCredentials(dsInfo *datasourceInfo) *credentials.Credentials {
cacheKey := dsInfo.Profile + ":" + dsInfo.AssumeRoleArn
credentialCacheLock.RLock()
if _, ok := awsCredentialCache[cacheKey]; ok {
	if awsCredentialCache[cacheKey].expiration != nil &&
@@ -74,9 +107,9 @@ func getCredentials(profile string, region string, assumeRoleArn string) *creden
sessionToken := ""
var expiration *time.Time
expiration = nil
if strings.Index(assumeRoleArn, "arn:aws:iam:") == 0 {
if strings.Index(dsInfo.AssumeRoleArn, "arn:aws:iam:") == 0 {
	params := &sts.AssumeRoleInput{
		RoleArn: aws.String(assumeRoleArn),
		RoleArn: aws.String(dsInfo.AssumeRoleArn),
		RoleSessionName: aws.String("GrafanaSession"),
		DurationSeconds: aws.Int64(900),
	}

@@ -85,13 +118,14 @@ func getCredentials(profile string, region string, assumeRoleArn string) *creden

stsCreds := credentials.NewChainCredentials(
	[]credentials.Provider{
		&credentials.EnvProvider{},
		&credentials.SharedCredentialsProvider{Filename: "", Profile: profile},
		&credentials.SharedCredentialsProvider{Filename: "", Profile: dsInfo.Profile},
		&ec2rolecreds.EC2RoleProvider{Client: ec2metadata.New(stsSess), ExpiryWindow: 5 * time.Minute},
	})
stsConfig := &aws.Config{
	Region: aws.String(region),
	Region: aws.String(dsInfo.Region),
	Credentials: stsCreds,
}

svc := sts.New(session.New(stsConfig), stsConfig)
resp, err := svc.AssumeRole(params)
if err != nil {

@@ -115,9 +149,14 @@ func getCredentials(profile string, region string, assumeRoleArn string) *creden

	SessionToken: sessionToken,
}},
&credentials.EnvProvider{},
&credentials.SharedCredentialsProvider{Filename: "", Profile: profile},
&credentials.StaticProvider{Value: credentials.Value{
	AccessKeyID:     dsInfo.AccessKey,
	SecretAccessKey: dsInfo.SecretKey,
}},
&credentials.SharedCredentialsProvider{Filename: "", Profile: dsInfo.Profile},
&ec2rolecreds.EC2RoleProvider{Client: ec2metadata.New(sess), ExpiryWindow: 5 * time.Minute},
})

credentialCacheLock.Lock()
awsCredentialCache[cacheKey] = cache{
	credential: creds,

@@ -129,10 +168,9 @@ func getCredentials(profile string, region string, assumeRoleArn string) *creden

}

func getAwsConfig(req *cwRequest) *aws.Config {
	assumeRoleArn := req.DataSource.JsonData.Get("assumeRoleArn").MustString()
	cfg := &aws.Config{
		Region: aws.String(req.Region),
		Credentials: getCredentials(req.DataSource.Database, req.Region, assumeRoleArn),
		Credentials: getCredentials(req.GetDatasourceInfo()),
	}
	return cfg
}
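The caching scheme in `getCredentials` — a map guarded by `sync.RWMutex`, keyed on profile plus assumed role, with an optional expiry for STS credentials — can be distilled into a self-contained sketch. Names (`lookup`, `store`, `cachedCreds`) are illustrative, not from the Grafana source:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// cachedCreds holds a cached value plus an optional expiry, mirroring the
// cache struct used for AWS credentials above.
type cachedCreds struct {
	value      string
	expiration *time.Time
}

var (
	credCache = map[string]cachedCreds{}
	cacheLock sync.RWMutex
)

// lookup returns a cached entry under a read lock; expired entries are
// treated as misses so STS credentials get refreshed.
func lookup(profile, roleArn string) (string, bool) {
	key := profile + ":" + roleArn
	cacheLock.RLock()
	defer cacheLock.RUnlock()
	c, ok := credCache[key]
	if !ok {
		return "", false
	}
	if c.expiration != nil && c.expiration.Before(time.Now()) {
		return "", false
	}
	return c.value, true
}

// store inserts an entry under the write lock.
func store(profile, roleArn, value string) {
	cacheLock.Lock()
	defer cacheLock.Unlock()
	credCache[profile+":"+roleArn] = cachedCreds{value: value}
}

func main() {
	store("default", "arn:aws:iam::123:role/x", "creds")
	v, ok := lookup("default", "arn:aws:iam::123:role/x")
	fmt.Println(v, ok)
}
```

Using an `RWMutex` lets many concurrent requests hit the cache read path without serializing; only a refresh takes the write lock.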
@@ -143,25 +181,33 @@ func handleGetMetricStatistics(req *cwRequest, c *middleware.Context) {

	reqParam := &struct {
		Parameters struct {
			Namespace  string                  `json:"namespace"`
			MetricName string                  `json:"metricName"`
			Dimensions []*cloudwatch.Dimension `json:"dimensions"`
			Statistics []*string               `json:"statistics"`
			StartTime  int64                   `json:"startTime"`
			EndTime    int64                   `json:"endTime"`
			Period     int64                   `json:"period"`
			Namespace          string                  `json:"namespace"`
			MetricName         string                  `json:"metricName"`
			Dimensions         []*cloudwatch.Dimension `json:"dimensions"`
			Statistics         []*string               `json:"statistics"`
			ExtendedStatistics []*string               `json:"extendedStatistics"`
			StartTime          int64                   `json:"startTime"`
			EndTime            int64                   `json:"endTime"`
			Period             int64                   `json:"period"`
		} `json:"parameters"`
	}{}
	json.Unmarshal(req.Body, reqParam)

	params := &cloudwatch.GetMetricStatisticsInput{
		Namespace:  aws.String(reqParam.Parameters.Namespace),
		MetricName: aws.String(reqParam.Parameters.MetricName),
		Dimensions: reqParam.Parameters.Dimensions,
		Statistics: reqParam.Parameters.Statistics,
		StartTime:  aws.Time(time.Unix(reqParam.Parameters.StartTime, 0)),
		EndTime:    aws.Time(time.Unix(reqParam.Parameters.EndTime, 0)),
		Period:     aws.Int64(reqParam.Parameters.Period),
		Namespace:          aws.String(reqParam.Parameters.Namespace),
		MetricName:         aws.String(reqParam.Parameters.MetricName),
		Dimensions:         reqParam.Parameters.Dimensions,
		Statistics:         reqParam.Parameters.Statistics,
		ExtendedStatistics: reqParam.Parameters.ExtendedStatistics,
		StartTime:          aws.Time(time.Unix(reqParam.Parameters.StartTime, 0)),
		EndTime:            aws.Time(time.Unix(reqParam.Parameters.EndTime, 0)),
		Period:             aws.Int64(reqParam.Parameters.Period),
	}
	if len(reqParam.Parameters.Statistics) != 0 {
		params.Statistics = reqParam.Parameters.Statistics
	}
	if len(reqParam.Parameters.ExtendedStatistics) != 0 {
		params.ExtendedStatistics = reqParam.Parameters.ExtendedStatistics
	}

	resp, err := svc.GetMetricStatistics(params)
@@ -254,11 +300,12 @@ func handleDescribeAlarmsForMetric(req *cwRequest, c *middleware.Context) {

	reqParam := &struct {
		Parameters struct {
			Namespace  string                  `json:"namespace"`
			MetricName string                  `json:"metricName"`
			Dimensions []*cloudwatch.Dimension `json:"dimensions"`
			Statistic  string                  `json:"statistic"`
			Period     int64                   `json:"period"`
			Namespace         string                  `json:"namespace"`
			MetricName        string                  `json:"metricName"`
			Dimensions        []*cloudwatch.Dimension `json:"dimensions"`
			Statistic         string                  `json:"statistic"`
			ExtendedStatistic string                  `json:"extendedStatistic"`
			Period            int64                   `json:"period"`
		} `json:"parameters"`
	}{}
	json.Unmarshal(req.Body, reqParam)
@@ -274,6 +321,9 @@ func handleDescribeAlarmsForMetric(req *cwRequest, c *middleware.Context) {
	if reqParam.Parameters.Statistic != "" {
		params.Statistic = aws.String(reqParam.Parameters.Statistic)
	}
	if reqParam.Parameters.ExtendedStatistic != "" {
		params.ExtendedStatistic = aws.String(reqParam.Parameters.ExtendedStatistic)
	}

	resp, err := svc.DescribeAlarmsForMetric(params)
	if err != nil {

@@ -192,8 +192,10 @@ func handleGetMetrics(req *cwRequest, c *middleware.Context) {
		}
	} else {
		var err error
		assumeRoleArn := req.DataSource.JsonData.Get("assumeRoleArn").MustString()
		if namespaceMetrics, err = getMetricsForCustomMetrics(req.Region, reqParam.Parameters.Namespace, req.DataSource.Database, assumeRoleArn, getAllMetrics); err != nil {
		cwData := req.GetDatasourceInfo()
		cwData.Namespace = reqParam.Parameters.Namespace

		if namespaceMetrics, err = getMetricsForCustomMetrics(cwData, getAllMetrics); err != nil {
			c.JsonApiErr(500, "Unable to call AWS API", err)
			return
		}
@@ -226,8 +228,10 @@ func handleGetDimensions(req *cwRequest, c *middleware.Context) {
		}
	} else {
		var err error
		assumeRoleArn := req.DataSource.JsonData.Get("assumeRoleArn").MustString()
		if dimensionValues, err = getDimensionsForCustomMetrics(req.Region, reqParam.Parameters.Namespace, req.DataSource.Database, assumeRoleArn, getAllMetrics); err != nil {
		dsInfo := req.GetDatasourceInfo()
		dsInfo.Namespace = reqParam.Parameters.Namespace

		if dimensionValues, err = getDimensionsForCustomMetrics(dsInfo, getAllMetrics); err != nil {
			c.JsonApiErr(500, "Unable to call AWS API", err)
			return
		}
@@ -242,16 +246,16 @@ func handleGetDimensions(req *cwRequest, c *middleware.Context) {
	c.JSON(200, result)
}

func getAllMetrics(region string, namespace string, database string, assumeRoleArn string) (cloudwatch.ListMetricsOutput, error) {
func getAllMetrics(cwData *datasourceInfo) (cloudwatch.ListMetricsOutput, error) {
	cfg := &aws.Config{
		Region:      aws.String(region),
		Credentials: getCredentials(database, region, assumeRoleArn),
		Region:      aws.String(cwData.Region),
		Credentials: getCredentials(cwData),
	}

	svc := cloudwatch.New(session.New(cfg), cfg)

	params := &cloudwatch.ListMetricsInput{
		Namespace: aws.String(namespace),
		Namespace: aws.String(cwData.Namespace),
	}

	var resp cloudwatch.ListMetricsOutput
@@ -272,8 +276,8 @@ func getAllMetrics(region string, namespace string, database string, assumeRoleA

var metricsCacheLock sync.Mutex

func getMetricsForCustomMetrics(region string, namespace string, database string, assumeRoleArn string, getAllMetrics func(string, string, string, string) (cloudwatch.ListMetricsOutput, error)) ([]string, error) {
	result, err := getAllMetrics(region, namespace, database, assumeRoleArn)
func getMetricsForCustomMetrics(dsInfo *datasourceInfo, getAllMetrics func(*datasourceInfo) (cloudwatch.ListMetricsOutput, error)) ([]string, error) {
	result, err := getAllMetrics(dsInfo)
	if err != nil {
		return []string{}, err
	}
@@ -281,37 +285,37 @@ func getMetricsForCustomMetrics(region string, namespace string, database string
	metricsCacheLock.Lock()
	defer metricsCacheLock.Unlock()

	if _, ok := customMetricsMetricsMap[database]; !ok {
		customMetricsMetricsMap[database] = make(map[string]map[string]*CustomMetricsCache)
	if _, ok := customMetricsMetricsMap[dsInfo.Profile]; !ok {
		customMetricsMetricsMap[dsInfo.Profile] = make(map[string]map[string]*CustomMetricsCache)
	}
	if _, ok := customMetricsMetricsMap[database][region]; !ok {
		customMetricsMetricsMap[database][region] = make(map[string]*CustomMetricsCache)
	if _, ok := customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region]; !ok {
		customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region] = make(map[string]*CustomMetricsCache)
	}
	if _, ok := customMetricsMetricsMap[database][region][namespace]; !ok {
		customMetricsMetricsMap[database][region][namespace] = &CustomMetricsCache{}
		customMetricsMetricsMap[database][region][namespace].Cache = make([]string, 0)
	if _, ok := customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace]; !ok {
		customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace] = &CustomMetricsCache{}
		customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = make([]string, 0)
	}

	if customMetricsMetricsMap[database][region][namespace].Expire.After(time.Now()) {
		return customMetricsMetricsMap[database][region][namespace].Cache, nil
	if customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Expire.After(time.Now()) {
		return customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, nil
	}
	customMetricsMetricsMap[database][region][namespace].Cache = make([]string, 0)
	customMetricsMetricsMap[database][region][namespace].Expire = time.Now().Add(5 * time.Minute)
	customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = make([]string, 0)
	customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Expire = time.Now().Add(5 * time.Minute)

	for _, metric := range result.Metrics {
		if isDuplicate(customMetricsMetricsMap[database][region][namespace].Cache, *metric.MetricName) {
		if isDuplicate(customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, *metric.MetricName) {
			continue
		}
		customMetricsMetricsMap[database][region][namespace].Cache = append(customMetricsMetricsMap[database][region][namespace].Cache, *metric.MetricName)
		customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = append(customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, *metric.MetricName)
	}

	return customMetricsMetricsMap[database][region][namespace].Cache, nil
	return customMetricsMetricsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, nil
}

var dimensionsCacheLock sync.Mutex

func getDimensionsForCustomMetrics(region string, namespace string, database string, assumeRoleArn string, getAllMetrics func(string, string, string, string) (cloudwatch.ListMetricsOutput, error)) ([]string, error) {
	result, err := getAllMetrics(region, namespace, database, assumeRoleArn)
func getDimensionsForCustomMetrics(dsInfo *datasourceInfo, getAllMetrics func(*datasourceInfo) (cloudwatch.ListMetricsOutput, error)) ([]string, error) {
	result, err := getAllMetrics(dsInfo)
	if err != nil {
		return []string{}, err
	}
@@ -319,33 +323,33 @@ func getDimensionsForCustomMetrics(region string, namespace string, database str
	dimensionsCacheLock.Lock()
	defer dimensionsCacheLock.Unlock()

	if _, ok := customMetricsDimensionsMap[database]; !ok {
		customMetricsDimensionsMap[database] = make(map[string]map[string]*CustomMetricsCache)
	if _, ok := customMetricsDimensionsMap[dsInfo.Profile]; !ok {
		customMetricsDimensionsMap[dsInfo.Profile] = make(map[string]map[string]*CustomMetricsCache)
	}
	if _, ok := customMetricsDimensionsMap[database][region]; !ok {
		customMetricsDimensionsMap[database][region] = make(map[string]*CustomMetricsCache)
	if _, ok := customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region]; !ok {
		customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region] = make(map[string]*CustomMetricsCache)
	}
	if _, ok := customMetricsDimensionsMap[database][region][namespace]; !ok {
		customMetricsDimensionsMap[database][region][namespace] = &CustomMetricsCache{}
		customMetricsDimensionsMap[database][region][namespace].Cache = make([]string, 0)
	if _, ok := customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace]; !ok {
		customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace] = &CustomMetricsCache{}
		customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = make([]string, 0)
	}

	if customMetricsDimensionsMap[database][region][namespace].Expire.After(time.Now()) {
		return customMetricsDimensionsMap[database][region][namespace].Cache, nil
	if customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Expire.After(time.Now()) {
		return customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, nil
	}
	customMetricsDimensionsMap[database][region][namespace].Cache = make([]string, 0)
	customMetricsDimensionsMap[database][region][namespace].Expire = time.Now().Add(5 * time.Minute)
	customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = make([]string, 0)
	customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Expire = time.Now().Add(5 * time.Minute)

	for _, metric := range result.Metrics {
		for _, dimension := range metric.Dimensions {
			if isDuplicate(customMetricsDimensionsMap[database][region][namespace].Cache, *dimension.Name) {
			if isDuplicate(customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, *dimension.Name) {
				continue
			}
			customMetricsDimensionsMap[database][region][namespace].Cache = append(customMetricsDimensionsMap[database][region][namespace].Cache, *dimension.Name)
			customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache = append(customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, *dimension.Name)
		}
	}

	return customMetricsDimensionsMap[database][region][namespace].Cache, nil
	return customMetricsDimensionsMap[dsInfo.Profile][dsInfo.Region][dsInfo.Namespace].Cache, nil
}

func isDuplicate(nameList []string, target string) bool {

@@ -11,11 +11,13 @@ import (
func TestCloudWatchMetrics(t *testing.T) {

	Convey("When calling getMetricsForCustomMetrics", t, func() {
		region := "us-east-1"
		namespace := "Foo"
		database := "default"
		assumeRoleArn := ""
		f := func(region string, namespace string, database string, assumeRoleArn string) (cloudwatch.ListMetricsOutput, error) {
		dsInfo := &datasourceInfo{
			Region:        "us-east-1",
			Namespace:     "Foo",
			Profile:       "default",
			AssumeRoleArn: "",
		}
		f := func(dsInfo *datasourceInfo) (cloudwatch.ListMetricsOutput, error) {
			return cloudwatch.ListMetricsOutput{
				Metrics: []*cloudwatch.Metric{
					{
@@ -29,7 +31,7 @@ func TestCloudWatchMetrics(t *testing.T) {
				},
			}, nil
		}
		metrics, _ := getMetricsForCustomMetrics(region, namespace, database, assumeRoleArn, f)
		metrics, _ := getMetricsForCustomMetrics(dsInfo, f)

		Convey("Should contain Test_MetricName", func() {
			So(metrics, ShouldContain, "Test_MetricName")
@@ -37,11 +39,13 @@ func TestCloudWatchMetrics(t *testing.T) {
	})

	Convey("When calling getDimensionsForCustomMetrics", t, func() {
		region := "us-east-1"
		namespace := "Foo"
		database := "default"
		assumeRoleArn := ""
		f := func(region string, namespace string, database string, assumeRoleArn string) (cloudwatch.ListMetricsOutput, error) {
		dsInfo := &datasourceInfo{
			Region:        "us-east-1",
			Namespace:     "Foo",
			Profile:       "default",
			AssumeRoleArn: "",
		}
		f := func(dsInfo *datasourceInfo) (cloudwatch.ListMetricsOutput, error) {
			return cloudwatch.ListMetricsOutput{
				Metrics: []*cloudwatch.Metric{
					{
@@ -55,7 +59,7 @@ func TestCloudWatchMetrics(t *testing.T) {
				},
			}, nil
		}
		dimensionKeys, _ := getDimensionsForCustomMetrics(region, namespace, database, assumeRoleArn, f)
		dimensionKeys, _ := getDimensionsForCustomMetrics(dsInfo, f)

		Convey("Should contain Test_DimensionName", func() {
			So(dimensionKeys, ShouldContain, "Test_DimensionName")

@@ -121,6 +121,10 @@ func PostDashboard(c *middleware.Context, cmd m.SaveDashboardCommand) Response {
	}

	dash := cmd.GetDashboardModel()
	// Check if Title is empty
	if dash.Title == "" {
		return ApiError(400, m.ErrDashboardTitleEmpty.Error(), nil)
	}
	if dash.Id == 0 {
		limitReached, err := middleware.QuotaReached(c, "dashboard")
		if err != nil {

@@ -1,8 +1,6 @@
package api

import (
	"crypto/tls"
	"net"
	"net/http"
	"net/http/httputil"
	"net/url"
@@ -17,16 +15,6 @@ import (
	"github.com/grafana/grafana/pkg/util"
)

var dataProxyTransport = &http.Transport{
	TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	Proxy:           http.ProxyFromEnvironment,
	Dial: (&net.Dialer{
		Timeout:   30 * time.Second,
		KeepAlive: 30 * time.Second,
	}).Dial,
	TLSHandshakeTimeout: 10 * time.Second,
}

func NewReverseProxy(ds *m.DataSource, proxyPath string, targetUrl *url.URL) *httputil.ReverseProxy {
	director := func(req *http.Request) {
		req.URL.Scheme = targetUrl.Scheme
@@ -128,7 +116,11 @@ func ProxyDataSourceRequest(c *middleware.Context) {
	}

	proxy := NewReverseProxy(ds, proxyPath, targetUrl)
	proxy.Transport = dataProxyTransport
	proxy.Transport, err = ds.GetHttpTransport()
	if err != nil {
		c.JsonApiErr(400, "Unable to load TLS certificate", err)
		return
	}
	proxy.ServeHTTP(c.Resp, c.Req.Request)
	c.Resp.Header().Del("Set-Cookie")
}

@@ -11,11 +11,16 @@ import (
)

func TestDataSourceProxy(t *testing.T) {

	Convey("When getting graphite datasource proxy", t, func() {
		ds := m.DataSource{Url: "htttp://graphite:8080", Type: m.DS_GRAPHITE}
		targetUrl, _ := url.Parse(ds.Url)
		targetUrl, err := url.Parse(ds.Url)
		proxy := NewReverseProxy(&ds, "/render", targetUrl)
		proxy.Transport, err = ds.GetHttpTransport()
		So(err, ShouldBeNil)

		transport, ok := proxy.Transport.(*http.Transport)
		So(ok, ShouldBeTrue)
		So(transport.TLSClientConfig.InsecureSkipVerify, ShouldBeTrue)

		requestUrl, _ := url.Parse("http://grafana.com/sub")
		req := http.Request{URL: requestUrl}
@@ -54,7 +59,5 @@ func TestDataSourceProxy(t *testing.T) {
		So(queryVals["u"][0], ShouldEqual, "user")
		So(queryVals["p"][0], ShouldEqual, "password")
	})

	})

}

@@ -5,10 +5,9 @@ import (

	"github.com/grafana/grafana/pkg/api/dtos"
	"github.com/grafana/grafana/pkg/bus"
	"github.com/grafana/grafana/pkg/plugins"
	//"github.com/grafana/grafana/pkg/log"
	"github.com/grafana/grafana/pkg/middleware"
	m "github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/plugins"
	"github.com/grafana/grafana/pkg/util"
)

@@ -104,17 +103,56 @@ func AddDataSource(c *middleware.Context, cmd m.AddDataSourceCommand) {
	c.JSON(200, util.DynMap{"message": "Datasource added", "id": cmd.Result.Id})
}

func UpdateDataSource(c *middleware.Context, cmd m.UpdateDataSourceCommand) {
func UpdateDataSource(c *middleware.Context, cmd m.UpdateDataSourceCommand) Response {
	cmd.OrgId = c.OrgId
	cmd.Id = c.ParamsInt64(":id")

	err := bus.Dispatch(&cmd)
	err := fillWithSecureJsonData(&cmd)
	if err != nil {
		c.JsonApiErr(500, "Failed to update datasource", err)
		return
		return ApiError(500, "Failed to update datasource", err)
	}

	c.JsonOK("Datasource updated")
	err = bus.Dispatch(&cmd)
	if err != nil {
		return ApiError(500, "Failed to update datasource", err)
	}

	return Json(200, "Datasource updated")
}

func fillWithSecureJsonData(cmd *m.UpdateDataSourceCommand) error {
	if len(cmd.SecureJsonData) == 0 {
		return nil
	}

	ds, err := getRawDataSourceById(cmd.Id, cmd.OrgId)

	if err != nil {
		return err
	}
	secureJsonData := ds.SecureJsonData.Decrypt()

	for k, v := range secureJsonData {

		if _, ok := cmd.SecureJsonData[k]; !ok {
			cmd.SecureJsonData[k] = v
		}
	}

	return nil
}

func getRawDataSourceById(id int64, orgId int64) (*m.DataSource, error) {
	query := m.GetDataSourceByIdQuery{
		Id:    id,
		OrgId: orgId,
	}

	if err := bus.Dispatch(&query); err != nil {
		return nil, err
	}

	return query.Result, nil
}

// Get /api/datasources/name/:name
@@ -152,7 +190,7 @@ func GetDataSourceIdByName(c *middleware.Context) Response {
}

func convertModelToDtos(ds *m.DataSource) dtos.DataSource {
	return dtos.DataSource{
	dto := dtos.DataSource{
		Id:    ds.Id,
		OrgId: ds.OrgId,
		Name:  ds.Name,
@@ -169,4 +207,18 @@ func convertModelToDtos(ds *m.DataSource) dtos.DataSource {
		IsDefault: ds.IsDefault,
		JsonData:  ds.JsonData,
	}

	if len(ds.SecureJsonData) > 0 {
		dto.TLSAuth.CACertSet = len(ds.SecureJsonData["tlsCACert"]) > 0
		dto.TLSAuth.ClientCertSet = len(ds.SecureJsonData["tlsClientCert"]) > 0
		dto.TLSAuth.ClientKeySet = len(ds.SecureJsonData["tlsClientKey"]) > 0
	}

	for k, v := range ds.SecureJsonData {
		if len(v) > 0 {
			dto.EncryptedFields = append(dto.EncryptedFields, k)
		}
	}

	return dto
}

@@ -81,6 +81,15 @@ type DataSource struct {
	WithCredentials bool             `json:"withCredentials"`
	IsDefault       bool             `json:"isDefault"`
	JsonData        *simplejson.Json `json:"jsonData,omitempty"`
	TLSAuth         TLSAuth          `json:"tlsAuth,omitempty"`
	EncryptedFields []string         `json:"encryptedFields"`
}

// TLSAuth is used to show if TLS certs have been uploaded already
type TLSAuth struct {
	CACertSet     bool `json:"tlsCACertSet"`
	ClientCertSet bool `json:"tlsClientCertSet"`
	ClientKeySet  bool `json:"tlsClientKeySet"`
}

type DataSourceList []DataSource

@@ -8,6 +8,7 @@ import (
	"github.com/grafana/grafana/pkg/api/dtos"
	"github.com/grafana/grafana/pkg/metrics"
	"github.com/grafana/grafana/pkg/middleware"
	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/tsdb"
	"github.com/grafana/grafana/pkg/tsdb/testdata"
	"github.com/grafana/grafana/pkg/util"
@@ -25,9 +26,9 @@ func QueryMetrics(c *middleware.Context, reqDto dtos.MetricRequest) Response {
		MaxDataPoints: query.Get("maxDataPoints").MustInt64(100),
		IntervalMs:    query.Get("intervalMs").MustInt64(1000),
		Model:         query,
		DataSource: &tsdb.DataSourceInfo{
			Name:     "Grafana TestDataDB",
			PluginId: "grafana-testdata-datasource",
		DataSource: &models.DataSource{
			Name: "Grafana TestDataDB",
			Type: "grafana-testdata-datasource",
		},
	})
}
@@ -69,6 +70,10 @@ func GetInternalMetrics(c *middleware.Context) Response {
		metricName := m.Name() + m.StringifyTags()

		switch metric := m.(type) {
		case metrics.Gauge:
			resp[metricName] = map[string]interface{}{
				"value": metric.Value(),
			}
		case metrics.Counter:
			resp[metricName] = map[string]interface{}{
				"count": metric.Count(),

@@ -152,6 +152,9 @@ func updateOrgAddressHelper(form dtos.UpdateOrgAddressForm, orgId int64) Respons
// GET /api/orgs/:orgId
func DeleteOrgById(c *middleware.Context) Response {
	if err := bus.Dispatch(&m.DeleteOrgCommand{Id: c.ParamsInt64(":orgId")}); err != nil {
		if err == m.ErrOrgNotFound {
			return ApiError(404, "Failed to delete organization. ID not found", nil)
		}
		return ApiError(500, "Failed to update organization", err)
	}
	return ApiSuccess("Organization deleted")

@@ -26,7 +26,7 @@ import (
	_ "github.com/grafana/grafana/pkg/tsdb/testdata"
)

var version = "3.1.0"
var version = "4.0.0"
var commit = "NA"
var buildstamp string
var build_date string

@@ -46,14 +46,15 @@ func newMacaron() *macaron.Macaron {
		Delims: macaron.Delims{Left: "[[", Right: "]]"},
	}))

	if setting.EnforceDomain {
		m.Use(middleware.ValidateHostHeader(setting.Domain))
	}

	m.Use(middleware.GetContextHandler())
	m.Use(middleware.Sessioner(&setting.SessionOptions))
	m.Use(middleware.RequestMetrics())

	// needs to be after context handler
	if setting.EnforceDomain {
		m.Use(middleware.ValidateHostHeader(setting.Domain))
	}

	return m
}

@@ -7,7 +7,6 @@ import (
	"net/http"
	"net/url"
	"path"
	"time"

	"github.com/grafana/grafana/pkg/util"
)
@@ -19,14 +18,17 @@ type WebdavUploader struct {
}

func (u *WebdavUploader) Upload(pa string) (string, error) {
	client := http.Client{Timeout: time.Duration(10 * time.Second)}

	url, _ := url.Parse(u.url)
	url.Path = path.Join(url.Path, util.GetRandomString(20)+".png")

	imgData, err := ioutil.ReadFile(pa)
	req, err := http.NewRequest("PUT", url.String(), bytes.NewReader(imgData))
	res, err := client.Do(req)

	if u.username != "" {
		req.SetBasicAuth(u.username, u.password)
	}

	res, err := http.DefaultClient.Do(req)

	if err != nil {
		return "", err

@@ -35,12 +35,12 @@ func RenderToPng(params *RenderOpts) (string, error) {
		executable = executable + ".exe"
	}

	localAddress := "localhost"
	localDomain := "localhost"
	if setting.HttpAddr != setting.DEFAULT_HTTP_ADDR {
		localAddress = setting.HttpAddr
		localDomain = setting.HttpAddr
	}

	url := fmt.Sprintf("%s://%s:%s/%s", setting.Protocol, localAddress, setting.HttpPort, params.Path)
	url := fmt.Sprintf("%s://%s:%s/%s", setting.Protocol, localDomain, setting.HttpPort, params.Path)

	binPath, _ := filepath.Abs(filepath.Join(setting.PhantomDir, executable))
	scriptPath, _ := filepath.Abs(filepath.Join(setting.PhantomDir, "render.js"))
@@ -52,12 +52,13 @@ func RenderToPng(params *RenderOpts) (string, error) {

	cmdArgs := []string{
		"--ignore-ssl-errors=true",
		"--web-security=false",
		scriptPath,
		"url=" + url,
		"width=" + params.Width,
		"height=" + params.Height,
		"png=" + pngPath,
		"domain=" + setting.Domain,
		"domain=" + localDomain,
		"renderKey=" + renderKey,
	}

pkg/components/securejsondata/securejsondata.go (new file, 24 lines)
@@ -0,0 +1,24 @@
package securejsondata

import (
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/util"
)

type SecureJsonData map[string][]byte

func (s SecureJsonData) Decrypt() map[string]string {
	decrypted := make(map[string]string)
	for key, data := range s {
		decrypted[key] = string(util.Decrypt(data, setting.SecretKey))
	}
	return decrypted
}

func GetEncryptedJsonData(sjd map[string]string) SecureJsonData {
	encrypted := make(SecureJsonData)
	for key, data := range sjd {
		encrypted[key] = util.Encrypt([]byte(data), setting.SecretKey)
	}
	return encrypted
}

@@ -32,8 +32,8 @@ func RegGauge(name string, tagStrings ...string) Gauge {

// GaugeSnapshot is a read-only copy of another Gauge.
type GaugeSnapshot struct {
	*MetricMeta
	value int64
	*MetricMeta
}

// Snapshot returns the snapshot.
@@ -61,9 +61,10 @@ func (NilGauge) Value() int64 { return 0 }

// StandardGauge is the standard implementation of a Gauge and uses the
// sync/atomic package to manage a single int64 value.
// atomic needs 64-bit aligned memory, which is ensured for the first word
type StandardGauge struct {
	*MetricMeta
	value int64
	*MetricMeta
}

// Snapshot returns a read-only copy of the gauge.

@@ -9,54 +9,55 @@ func init() {
|
||||
}
|
||||
|
||||
var (
|
||||
M_Instance_Start Counter
|
||||
M_Page_Status_200 Counter
|
||||
M_Page_Status_500 Counter
|
||||
M_Page_Status_404 Counter
|
||||
M_Page_Status_Unknown Counter
|
||||
M_Api_Status_200 Counter
|
||||
M_Api_Status_404 Counter
|
||||
M_Api_Status_500 Counter
|
||||
M_Api_Status_Unknown Counter
|
||||
M_Proxy_Status_200 Counter
|
||||
M_Proxy_Status_404 Counter
|
||||
M_Proxy_Status_500 Counter
|
||||
M_Proxy_Status_Unknown Counter
|
||||
M_Api_User_SignUpStarted Counter
|
||||
M_Api_User_SignUpCompleted Counter
|
||||
M_Api_User_SignUpInvite Counter
|
||||
M_Api_Dashboard_Save Timer
|
||||
M_Api_Dashboard_Get Timer
|
||||
M_Api_Dashboard_Search Timer
|
||||
M_Api_Admin_User_Create Counter
|
||||
M_Api_Login_Post Counter
|
||||
M_Api_Login_OAuth Counter
|
||||
M_Api_Org_Create Counter
|
||||
M_Api_Dashboard_Snapshot_Create Counter
|
||||
M_Api_Dashboard_Snapshot_External Counter
|
||||
M_Api_Dashboard_Snapshot_Get Counter
|
||||
M_Models_Dashboard_Insert Counter
|
||||
M_Alerting_Result_State_Alerting Counter
|
||||
M_Alerting_Result_State_Ok Counter
|
||||
M_Alerting_Result_State_Paused Counter
|
||||
M_Alerting_Result_State_NoData Counter
|
||||
M_Alerting_Result_State_Pending Counter
|
||||
M_Alerting_Active_Alerts Counter
|
||||
M_Alerting_Notification_Sent_Slack Counter
|
||||
M_Alerting_Notification_Sent_Email Counter
|
||||
M_Alerting_Notification_Sent_Webhook Counter
|
||||
M_Alerting_Notification_Sent_PagerDuty Counter
|
||||
|
||||
M_Instance_Start Counter
|
||||
M_Page_Status_200 Counter
|
||||
M_Page_Status_500 Counter
|
||||
M_Page_Status_404 Counter
|
||||
M_Page_Status_Unknown Counter
|
||||
M_Api_Status_200 Counter
|
||||
M_Api_Status_404 Counter
|
||||
M_Api_Status_500 Counter
|
||||
M_Api_Status_Unknown Counter
|
||||
M_Proxy_Status_200 Counter
|
||||
M_Proxy_Status_404 Counter
|
||||
M_Proxy_Status_500 Counter
|
||||
M_Proxy_Status_Unknown Counter
|
||||
M_Api_User_SignUpStarted Counter
|
||||
M_Api_User_SignUpCompleted Counter
|
||||
M_Api_User_SignUpInvite Counter
|
||||
M_Api_Dashboard_Save Timer
|
||||
M_Api_Dashboard_Get Timer
|
||||
M_Api_Dashboard_Search Timer
|
||||
M_Api_Admin_User_Create Counter
|
||||
M_Api_Login_Post Counter
|
||||
M_Api_Login_OAuth Counter
|
||||
M_Api_Org_Create Counter
|
||||
M_Api_Dashboard_Snapshot_Create Counter
|
||||
M_Api_Dashboard_Snapshot_External Counter
|
||||
M_Api_Dashboard_Snapshot_Get Counter
|
||||
M_Models_Dashboard_Insert Counter
|
||||
	M_Alerting_Result_State_Alerting Counter
	M_Alerting_Result_State_Ok Counter
	M_Alerting_Result_State_Paused Counter
	M_Alerting_Result_State_NoData Counter
	M_Alerting_Result_State_Pending Counter
	M_Alerting_Notification_Sent_Slack Counter
	M_Alerting_Notification_Sent_Email Counter
	M_Alerting_Notification_Sent_Webhook Counter
	M_Alerting_Notification_Sent_PagerDuty Counter
	M_Alerting_Notification_Sent_Victorops Counter
	M_Alerting_Notification_Sent_OpsGenie Counter

	// Timers
	M_DataSource_ProxyReq_Timer Timer
	M_Alerting_Exeuction_Time Timer

	// StatTotals
	M_StatTotal_Dashboards Gauge
	M_StatTotal_Users Gauge
	M_StatTotal_Orgs Gauge
	M_StatTotal_Playlists Gauge
	M_Alerting_Active_Alerts Gauge
)

func initMetricVars(settings *MetricSettings) {
@@ -105,17 +106,19 @@ func initMetricVars(settings *MetricSettings) {
	M_Alerting_Result_State_NoData = RegCounter("alerting.result", "state", "no_data")
	M_Alerting_Result_State_Pending = RegCounter("alerting.result", "state", "pending")

	M_Alerting_Active_Alerts = RegCounter("alerting.active_alerts")
	M_Alerting_Notification_Sent_Slack = RegCounter("alerting.notifications_sent", "type", "slack")
	M_Alerting_Notification_Sent_Email = RegCounter("alerting.notifications_sent", "type", "email")
	M_Alerting_Notification_Sent_Webhook = RegCounter("alerting.notifications_sent", "type", "webhook")
	M_Alerting_Notification_Sent_PagerDuty = RegCounter("alerting.notifications_sent", "type", "pagerduty")
	M_Alerting_Notification_Sent_Victorops = RegCounter("alerting.notifications_sent", "type", "victorops")
	M_Alerting_Notification_Sent_OpsGenie = RegCounter("alerting.notifications_sent", "type", "opsgenie")

	// Timers
	M_DataSource_ProxyReq_Timer = RegTimer("api.dataproxy.request.all")
	M_Alerting_Exeuction_Time = RegTimer("alerting.execution_time")

	// StatTotals
	M_Alerting_Active_Alerts = RegGauge("alerting.active_alerts")
	M_StatTotal_Dashboards = RegGauge("stat_totals", "stat", "dashboards")
	M_StatTotal_Users = RegGauge("stat_totals", "stat", "users")
	M_StatTotal_Orgs = RegGauge("stat_totals", "stat", "orgs")
@@ -8,7 +8,12 @@ import (
)

func ValidateHostHeader(domain string) macaron.Handler {
	return func(c *macaron.Context) {
	return func(c *Context) {
		// ignore local render calls
		if c.IsRenderCall {
			return
		}

		h := c.Req.Host
		if i := strings.Index(h, ":"); i >= 0 {
			h = h[:i]
@@ -15,6 +15,7 @@ var (
	ErrDashboardSnapshotNotFound = errors.New("Dashboard snapshot not found")
	ErrDashboardWithSameNameExists = errors.New("A dashboard with the same name already exists")
	ErrDashboardVersionMismatch = errors.New("The dashboard has been changed by someone else")
	ErrDashboardTitleEmpty = errors.New("Dashboard title cannot be empty")
)

type UpdatePluginDashboardError struct {
@@ -4,6 +4,7 @@ import (
	"errors"
	"time"

	"github.com/grafana/grafana/pkg/components/securejsondata"
	"github.com/grafana/grafana/pkg/components/simplejson"
)

@@ -46,6 +47,7 @@ type DataSource struct {
	WithCredentials bool
	IsDefault bool
	JsonData *simplejson.Json
	SecureJsonData securejsondata.SecureJsonData

	Created time.Time
	Updated time.Time
@@ -77,19 +79,20 @@ func IsKnownDataSourcePlugin(dsType string) bool {

// Also acts as api DTO
type AddDataSourceCommand struct {
	Name string `json:"name" binding:"Required"`
	Type string `json:"type" binding:"Required"`
	Access DsAccess `json:"access" binding:"Required"`
	Url string `json:"url"`
	Password string `json:"password"`
	Database string `json:"database"`
	User string `json:"user"`
	BasicAuth bool `json:"basicAuth"`
	BasicAuthUser string `json:"basicAuthUser"`
	BasicAuthPassword string `json:"basicAuthPassword"`
	WithCredentials bool `json:"withCredentials"`
	IsDefault bool `json:"isDefault"`
	JsonData *simplejson.Json `json:"jsonData"`
	SecureJsonData map[string]string `json:"secureJsonData"`

	OrgId int64 `json:"-"`

@@ -98,19 +101,20 @@ type AddDataSourceCommand struct {

// Also acts as api DTO
type UpdateDataSourceCommand struct {
	Name string `json:"name" binding:"Required"`
	Type string `json:"type" binding:"Required"`
	Access DsAccess `json:"access" binding:"Required"`
	Url string `json:"url"`
	Password string `json:"password"`
	User string `json:"user"`
	Database string `json:"database"`
	BasicAuth bool `json:"basicAuth"`
	BasicAuthUser string `json:"basicAuthUser"`
	BasicAuthPassword string `json:"basicAuthPassword"`
	WithCredentials bool `json:"withCredentials"`
	IsDefault bool `json:"isDefault"`
	JsonData *simplejson.Json `json:"jsonData"`
	SecureJsonData map[string]string `json:"secureJsonData"`

	OrgId int64 `json:"-"`
	Id int64 `json:"-"`
95
pkg/models/datasource_cache.go
Normal file
@@ -0,0 +1,95 @@
package models

import (
	"crypto/tls"
	"crypto/x509"
	"net"
	"net/http"
	"sync"
	"time"
)

type proxyTransportCache struct {
	cache map[int64]cachedTransport
	sync.Mutex
}

type cachedTransport struct {
	updated time.Time

	*http.Transport
}

var ptc = proxyTransportCache{
	cache: make(map[int64]cachedTransport),
}

func (ds *DataSource) GetHttpClient() (*http.Client, error) {
	transport, err := ds.GetHttpTransport()

	if err != nil {
		return nil, err
	}

	return &http.Client{
		Timeout: time.Duration(30 * time.Second),
		Transport: transport,
	}, nil
}

func (ds *DataSource) GetHttpTransport() (*http.Transport, error) {
	ptc.Lock()
	defer ptc.Unlock()

	if t, present := ptc.cache[ds.Id]; present && ds.Updated.Equal(t.updated) {
		return t.Transport, nil
	}

	transport := &http.Transport{
		TLSClientConfig: &tls.Config{
			InsecureSkipVerify: true,
		},
		Proxy: http.ProxyFromEnvironment,
		Dial: (&net.Dialer{
			Timeout: 30 * time.Second,
			KeepAlive: 30 * time.Second,
		}).Dial,
		TLSHandshakeTimeout: 10 * time.Second,
		ExpectContinueTimeout: 1 * time.Second,
		MaxIdleConns: 100,
		IdleConnTimeout: 90 * time.Second,
	}

	var tlsAuth, tlsAuthWithCACert bool
	if ds.JsonData != nil {
		tlsAuth = ds.JsonData.Get("tlsAuth").MustBool(false)
		tlsAuthWithCACert = ds.JsonData.Get("tlsAuthWithCACert").MustBool(false)
	}

	if tlsAuth {
		transport.TLSClientConfig.InsecureSkipVerify = false

		decrypted := ds.SecureJsonData.Decrypt()

		if tlsAuthWithCACert && len(decrypted["tlsCACert"]) > 0 {
			caPool := x509.NewCertPool()
			ok := caPool.AppendCertsFromPEM([]byte(decrypted["tlsCACert"]))
			if ok {
				transport.TLSClientConfig.RootCAs = caPool
			}
		}

		cert, err := tls.X509KeyPair([]byte(decrypted["tlsClientCert"]), []byte(decrypted["tlsClientKey"]))
		if err != nil {
			return nil, err
		}
		transport.TLSClientConfig.Certificates = []tls.Certificate{cert}
	}

	ptc.cache[ds.Id] = cachedTransport{
		Transport: transport,
		updated: ds.Updated,
	}

	return transport, nil
}
157
pkg/models/datasource_cache_test.go
Normal file
@@ -0,0 +1,157 @@
package models

import (
	"testing"
	"time"

	. "github.com/smartystreets/goconvey/convey"

	"github.com/grafana/grafana/pkg/components/simplejson"
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/util"
)

func TestDataSourceCache(t *testing.T) {
	Convey("When caching a datasource proxy", t, func() {
		clearCache()
		ds := DataSource{
			Id: 1,
			Url: "http://k8s:8001",
			Type: "Kubernetes",
		}

		t1, err := ds.GetHttpTransport()
		So(err, ShouldBeNil)

		t2, err := ds.GetHttpTransport()
		So(err, ShouldBeNil)

		Convey("Should be using the cached proxy", func() {
			So(t2, ShouldEqual, t1)
		})
	})

	Convey("When getting kubernetes datasource proxy", t, func() {
		clearCache()
		setting.SecretKey = "password"

		json := simplejson.New()
		json.Set("tlsAuth", true)
		json.Set("tlsAuthWithCACert", true)

		t := time.Now()
		ds := DataSource{
			Url: "http://k8s:8001",
			Type: "Kubernetes",
			Updated: t.Add(-2 * time.Minute),
		}

		transport, err := ds.GetHttpTransport()
		So(err, ShouldBeNil)

		Convey("Should have no cert", func() {
			So(transport.TLSClientConfig.InsecureSkipVerify, ShouldEqual, true)
		})

		ds.JsonData = json
		ds.SecureJsonData = map[string][]byte{
			"tlsCACert": util.Encrypt([]byte(caCert), "password"),
			"tlsClientCert": util.Encrypt([]byte(clientCert), "password"),
			"tlsClientKey": util.Encrypt([]byte(clientKey), "password"),
		}
		ds.Updated = t.Add(-1 * time.Minute)

		transport, err = ds.GetHttpTransport()
		So(err, ShouldBeNil)

		Convey("Should add cert", func() {
			So(transport.TLSClientConfig.InsecureSkipVerify, ShouldEqual, false)
			So(len(transport.TLSClientConfig.Certificates), ShouldEqual, 1)
		})

		ds.JsonData = nil
		ds.SecureJsonData = map[string][]byte{}
		ds.Updated = t

		transport, err = ds.GetHttpTransport()
		So(err, ShouldBeNil)

		Convey("Should remove cert", func() {
			So(transport.TLSClientConfig.InsecureSkipVerify, ShouldEqual, true)
			So(len(transport.TLSClientConfig.Certificates), ShouldEqual, 0)
		})
	})
}

func clearCache() {
	ptc.Lock()
	defer ptc.Unlock()

	ptc.cache = make(map[int64]cachedTransport)
}

const caCert string = `-----BEGIN CERTIFICATE-----
MIIDATCCAemgAwIBAgIJAMQ5hC3CPDTeMA0GCSqGSIb3DQEBCwUAMBcxFTATBgNV
BAMMDGNhLWs4cy1zdGhsbTAeFw0xNjEwMjcwODQyMjdaFw00NDAzMTQwODQyMjda
MBcxFTATBgNVBAMMDGNhLWs4cy1zdGhsbTCCASIwDQYJKoZIhvcNAQEBBQADggEP
ADCCAQoCggEBAMLe2AmJ6IleeUt69vgNchOjjmxIIxz5sp1vFu94m1vUip7CqnOg
QkpUsHeBPrGYv8UGloARCL1xEWS+9FVZeXWQoDmbC0SxXhFwRIESNCET7Q8KMi/4
4YPvnMLGZi3Fjwxa8BdUBCN1cx4WEooMVTWXm7RFMtZgDfuOAn3TNXla732sfT/d
1HNFrh48b0wA+HhmA3nXoBnBEblA665hCeo7lIAdRr0zJxJpnFnWXkyTClsAUTMN
iL905LdBiiIRenojipfKXvMz88XSaWTI7JjZYU3BvhyXndkT6f12cef3I96NY3WJ
0uIK4k04WrbzdYXMU3rN6NqlvbHqnI+E7aMCAwEAAaNQME4wHQYDVR0OBBYEFHHx
2+vSPw9bECHj3O51KNo5VdWOMB8GA1UdIwQYMBaAFHHx2+vSPw9bECHj3O51KNo5
VdWOMAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAH2eV5NcV3LBJHs9
I+adbiTPg2vyumrGWwy73T0X8Dtchgt8wU7Q9b9Ucg2fOTmSSyS0iMqEu1Yb2ORB
CknM9mixHC9PwEBbkGCom3VVkqdLwSP6gdILZgyLoH4i8sTUz+S1yGPepi+Vzhs7
adOXtryjcGnwft6HdfKPNklMOHFnjw6uqpho54oj/z55jUpicY/8glDHdrr1bh3k
MHuiWLGewHXPvxfG6UoUx1te65IhifVcJGFZDQwfEmhBflfCmtAJlZEsgTLlBBCh
FHoXIyGOdq1chmRVocdGBCF8fUoGIbuF14r53rpvcbEKtKnnP8+96luKAZLq0a4n
3lb92xM=
-----END CERTIFICATE-----`

const clientCert string = `-----BEGIN CERTIFICATE-----
MIICsjCCAZoCCQCcd8sOfstQLzANBgkqhkiG9w0BAQsFADAXMRUwEwYDVQQDDAxj
YS1rOHMtc3RobG0wHhcNMTYxMTAyMDkyNTE1WhcNMTcxMTAyMDkyNTE1WjAfMR0w
GwYDVQQDDBRhZG0tZGFuaWVsLWs4cy1zdGhsbTCCASIwDQYJKoZIhvcNAQEBBQAD
ggEPADCCAQoCggEBAOMliaWyNEUJKM37vWCl5bGub3lMicyRAqGQyY/qxD9yKKM2
FbucVcmWmg5vvTqQVl5rlQ+c7GI8OD6ptmFl8a26coEki7bFr8bkpSyBSEc5p27b
Z0ORFSqBHWHQbr9PkxPLYW6T3gZYUtRYv3OQgGxLXlvUh85n/mQfuR3N1FgmShHo
GtAFi/ht6leXa0Ms+jNSDLCmXpJm1GIEqgyKX7K3+g3vzo9coYqXq4XTa8Efs2v8
SCwqWfBC3rHfgs/5DLB8WT4Kul8QzxkytzcaBQfRfzhSV6bkgm7oTzt2/1eRRsf4
YnXzLE9YkCC9sAn+Owzqf+TYC1KRluWDfqqBTJUCAwEAATANBgkqhkiG9w0BAQsF
AAOCAQEAdMsZg6edWGC+xngizn0uamrUg1ViaDqUsz0vpzY5NWLA4MsBc4EtxWRP
ueQvjUimZ3U3+AX0YWNLIrH1FCVos2jdij/xkTUmHcwzr8rQy+B17cFi+a8jtpgw
AU6WWoaAIEhhbWQfth/Diz3mivl1ARB+YqiWca2mjRPLTPcKJEURDVddQ423el0Q
4JNxS5icu7T2zYTYHAo/cT9zVdLZl0xuLxYm3asK1IONJ/evxyVZima3il6MPvhe
58Hwz+m+HdqHxi24b/1J/VKYbISG4huOQCdLzeNXgvwFlGPUmHSnnKo1/KbQDAR5
llG/Sw5+FquFuChaA6l5KWy7F3bQyA==
-----END CERTIFICATE-----`

const clientKey string = `-----BEGIN RSA PRIVATE KEY-----
MIIEpQIBAAKCAQEA4yWJpbI0RQkozfu9YKXlsa5veUyJzJECoZDJj+rEP3IoozYV
u5xVyZaaDm+9OpBWXmuVD5zsYjw4Pqm2YWXxrbpygSSLtsWvxuSlLIFIRzmnbttn
Q5EVKoEdYdBuv0+TE8thbpPeBlhS1Fi/c5CAbEteW9SHzmf+ZB+5Hc3UWCZKEega
0AWL+G3qV5drQyz6M1IMsKZekmbUYgSqDIpfsrf6De/Oj1yhiperhdNrwR+za/xI
LCpZ8ELesd+Cz/kMsHxZPgq6XxDPGTK3NxoFB9F/OFJXpuSCbuhPO3b/V5FGx/hi
dfMsT1iQIL2wCf47DOp/5NgLUpGW5YN+qoFMlQIDAQABAoIBAQCzy4u312XeW1Cs
Mx6EuOwmh59/ESFmBkZh4rxZKYgrfE5EWlQ7i5SwG4BX+wR6rbNfy6JSmHDXlTkk
CKvvToVNcW6fYHEivDnVojhIERFIJ4+rhQmpBtcNLOQ3/4cZ8X/GxE6b+3lb5l+x
64mnjPLKRaIr5/+TVuebEy0xNTJmjnJ7yiB2HRz7uXEQaVSk/P7KAkkyl/9J3/LM
8N9AX1w6qDaNQZ4/P0++1H4SQenosM/b/GqGTomarEk/GE0NcB9rzmR9VCXa7FRh
WV5jyt9vUrwIEiK/6nUnOkGO8Ei3kB7Y+e+2m6WdaNoU5RAfqXmXa0Q/a0lLRruf
vTMo2WrBAoGBAPRaK4cx76Q+3SJ/wfznaPsMM06OSR8A3ctKdV+ip/lyKtb1W8Pz
k8MYQDH7GwPtSu5QD8doL00pPjugZL/ba7X9nAsI+pinyEErfnB9y7ORNEjIYYzs
DiqDKup7ANgw1gZvznWvb9Ge0WUSXvWS0pFkgootQAf+RmnnbWGH6l6RAoGBAO35
aGUrLro5u9RD24uSXNU3NmojINIQFK5dHAT3yl0BBYstL43AEsye9lX95uMPTvOQ
Cqcn42Hjp/bSe3n0ObyOZeXVrWcDFAfE0wwB1BkvL1lpgnFO9+VQORlH4w3Ppnpo
jcPkR2TFeDaAYtvckhxe/Bk3OnuFmnsQ3VzM75fFAoGBAI6PvS2XeNU+yA3EtA01
hg5SQ+zlHswz2TMuMeSmJZJnhY78f5mHlwIQOAPxGQXlf/4iP9J7en1uPpzTK3S0
M9duK4hUqMA/w5oiIhbHjf0qDnMYVbG+V1V+SZ+cPBXmCDihKreGr5qBKnHpkfV8
v9WL6o1rcRw4wiQvnaV1gsvBAoGBALtzVTczr6gDKCAIn5wuWy+cQSGTsBunjRLX
xuVm5iEiV+KMYkPvAx/pKzMLP96lRVR3ptyKgAKwl7LFk3u50+zh4gQLr35QH2wL
Lw7rNc3srAhrItPsFzqrWX6/cGuFoKYVS239l/sZzRppQPXcpb7xVvTp2whHcir0
Wtnpl+TdAoGAGqKqo2KU3JoY3IuTDUk1dsNAm8jd9EWDh+s1x4aG4N79mwcss5GD
FF8MbFPneK7xQd8L6HisKUDAUi2NOyynM81LAftPkvN6ZuUVeFDfCL4vCA0HUXLD
+VrOhtUZkNNJlLMiVRJuQKUOGlg8PpObqYbstQAf/0/yFJMRHG82Tcg=
-----END RSA PRIVATE KEY-----`
@@ -4,8 +4,7 @@ import (
	"errors"
	"time"

	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/util"
	"github.com/grafana/grafana/pkg/components/securejsondata"
)

var (
@@ -19,23 +18,13 @@ type PluginSetting struct {
	Enabled bool
	Pinned bool
	JsonData map[string]interface{}
	SecureJsonData SecureJsonData
	SecureJsonData securejsondata.SecureJsonData
	PluginVersion string

	Created time.Time
	Updated time.Time
}

type SecureJsonData map[string][]byte

func (s SecureJsonData) Decrypt() map[string]string {
	decrypted := make(map[string]string)
	for key, data := range s {
		decrypted[key] = string(util.Decrypt(data, setting.SecretKey))
	}
	return decrypted
}

// ----------------------
// COMMANDS

@@ -58,12 +47,8 @@ type UpdatePluginSettingVersionCmd struct {
	OrgId int64 `json:"-"`
}

func (cmd *UpdatePluginSettingCmd) GetEncryptedJsonData() SecureJsonData {
	encrypted := make(SecureJsonData)
	for key, data := range cmd.SecureJsonData {
		encrypted[key] = util.Encrypt([]byte(data), setting.SecretKey)
	}
	return encrypted
func (cmd *UpdatePluginSettingCmd) GetEncryptedJsonData() securejsondata.SecureJsonData {
	return securejsondata.GetEncryptedJsonData(cmd.SecureJsonData)
}

// ---------------------
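The refactor moves the `SecureJsonData` type into a shared `securejsondata` package so both datasources and plugin settings can encrypt a `map[string]string` into `map[string][]byte` and decrypt it back. A toy round-trip sketch of that shape, using a hypothetical XOR stand-in for `util.Encrypt`/`util.Decrypt` (Grafana's real implementation uses proper encryption, not this):

```go
package main

import "fmt"

// encrypt is a hypothetical XOR stand-in for util.Encrypt; do not use for real secrets.
func encrypt(data []byte, secret string) []byte {
	out := make([]byte, len(data))
	for i, b := range data {
		out[i] = b ^ secret[i%len(secret)]
	}
	return out
}

// decrypt reverses encrypt (XOR is its own inverse).
func decrypt(data []byte, secret string) []byte { return encrypt(data, secret) }

// SecureJsonData mirrors the shared type: a map of encrypted byte values.
type SecureJsonData map[string][]byte

// GetEncryptedJsonData encrypts every value of a plain string map.
func GetEncryptedJsonData(sjd map[string]string, secret string) SecureJsonData {
	encrypted := make(SecureJsonData)
	for key, data := range sjd {
		encrypted[key] = encrypt([]byte(data), secret)
	}
	return encrypted
}

// Decrypt restores the plain string map.
func (s SecureJsonData) Decrypt(secret string) map[string]string {
	decrypted := make(map[string]string)
	for key, data := range s {
		decrypted[key] = string(decrypt(data, secret))
	}
	return decrypted
}

func main() {
	enc := GetEncryptedJsonData(map[string]string{"tlsClientKey": "pem-bytes"}, "password")
	fmt.Println(enc.Decrypt("password")["tlsClientKey"]) // pem-bytes
}
```

Only the map-of-encrypted-values shape is the point here; the cipher itself is a placeholder.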
@@ -17,9 +17,9 @@ type AlertEvaluator interface {
	Eval(reducedValue null.Float) bool
}

type NoDataEvaluator struct{}
type NoValueEvaluator struct{}

func (e *NoDataEvaluator) Eval(reducedValue null.Float) bool {
func (e *NoValueEvaluator) Eval(reducedValue null.Float) bool {
	return reducedValue.Valid == false
}

@@ -118,8 +118,8 @@ func NewAlertEvaluator(model *simplejson.Json) (AlertEvaluator, error) {
		return newRangedEvaluator(typ, model)
	}

	if typ == "no_data" {
		return &NoDataEvaluator{}, nil
	if typ == "no_value" {
		return &NoValueEvaluator{}, nil
	}

	return nil, alerting.ValidationError{Reason: "Evaluator invalid evaluator type: " + typ}
@@ -44,15 +44,20 @@ func TestEvalutors(t *testing.T) {
		So(evalutorScenario(`{"type": "outside_range", "params": [100, 1] }`, 50), ShouldBeFalse)
	})

	Convey("no_data", t, func() {
		So(evalutorScenario(`{"type": "no_data", "params": [] }`, 50), ShouldBeFalse)
	Convey("no_value", t, func() {
		Convey("should be false if serie have values", func() {
			So(evalutorScenario(`{"type": "no_value", "params": [] }`, 50), ShouldBeFalse)
		})

		jsonModel, err := simplejson.NewJson([]byte(`{"type": "no_data", "params": [] }`))
		So(err, ShouldBeNil)
		Convey("should be true when the serie have no value", func() {
			jsonModel, err := simplejson.NewJson([]byte(`{"type": "no_value", "params": [] }`))
			So(err, ShouldBeNil)

		evaluator, err := NewAlertEvaluator(jsonModel)
		So(err, ShouldBeNil)
			evaluator, err := NewAlertEvaluator(jsonModel)
			So(err, ShouldBeNil)

		So(evaluator.Eval(null.FloatFromPtr(nil)), ShouldBeTrue)
			So(evaluator.Eval(null.FloatFromPtr(nil)), ShouldBeTrue)
		})
	})
}
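The renamed evaluator fires exactly when the reduced series value is missing. A self-contained sketch using a `nullFloat` stand-in for the `null.Float` type from the diff:

```go
package main

import "fmt"

// nullFloat stands in for null.Float: Valid is false when the serie had no value.
type nullFloat struct {
	Float64 float64
	Valid   bool
}

// NoValueEvaluator fires when the reduced value is missing, matching the
// no_data -> no_value rename in this diff.
type NoValueEvaluator struct{}

func (e *NoValueEvaluator) Eval(reduced nullFloat) bool {
	return !reduced.Valid
}

func main() {
	ev := &NoValueEvaluator{}
	fmt.Println(ev.Eval(nullFloat{Float64: 50, Valid: true})) // false: serie has a value
	fmt.Println(ev.Eval(nullFloat{}))                         // true: no value
}
```

This mirrors the `null.FloatFromPtr(nil)` assertion in the test hunk above: a nil pointer produces an invalid float, so the evaluator returns true.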
@@ -119,21 +119,9 @@ func (c *QueryCondition) getRequestForAlertRule(datasource *m.DataSource, timeRa
		TimeRange: timeRange,
		Queries: []*tsdb.Query{
			{
				RefId: "A",
				Model: c.Query.Model,
				DataSource: &tsdb.DataSourceInfo{
					Id: datasource.Id,
					Name: datasource.Name,
					PluginId: datasource.Type,
					Url: datasource.Url,
					User: datasource.User,
					Password: datasource.Password,
					Database: datasource.Database,
					BasicAuth: datasource.BasicAuth,
					BasicAuthUser: datasource.BasicAuthUser,
					BasicAuthPassword: datasource.BasicAuthPassword,
					JsonData: datasource.JsonData,
				},
				RefId: "A",
				Model: c.Query.Model,
				DataSource: datasource,
			},
		},
	}
@@ -27,13 +27,17 @@ func (s *SimpleReducer) Reduce(series *tsdb.TimeSeries) null.Float {

	switch s.Type {
	case "avg":
		validPointsCount := 0
		for _, point := range series.Points {
			if point[0].Valid {
				value += point[0].Float64
				validPointsCount += 1
				allNull = false
			}
		}
		value = value / float64(len(series.Points))
		if validPointsCount > 0 {
			value = value / float64(validPointsCount)
		}
	case "sum":
		for _, point := range series.Points {
			if point[0].Valid {
@@ -90,7 +94,6 @@ func (s *SimpleReducer) Reduce(series *tsdb.TimeSeries) null.Float {
				value = (values[(length/2)-1] + values[length/2]) / 2
			}
		}

	}

	if allNull {
@@ -11,11 +11,6 @@ import (

func TestSimpleReducer(t *testing.T) {
	Convey("Test simple reducer by calculating", t, func() {
		Convey("avg", func() {
			result := testReducer("avg", 1, 2, 3)
			So(result, ShouldEqual, float64(2))
		})

		Convey("sum", func() {
			result := testReducer("sum", 1, 2, 3)
			So(result, ShouldEqual, float64(6))
@@ -55,6 +50,25 @@ func TestSimpleReducer(t *testing.T) {
			result := testReducer("median", 1)
			So(result, ShouldEqual, float64(1))
		})

		Convey("avg", func() {
			result := testReducer("avg", 1, 2, 3)
			So(result, ShouldEqual, float64(2))
		})

		Convey("avg of number values and null values should ignore nulls", func() {
			reducer := NewSimpleReducer("avg")
			series := &tsdb.TimeSeries{
				Name: "test time serie",
			}

			series.Points = append(series.Points, tsdb.NewTimePoint(null.FloatFrom(3), 1))
			series.Points = append(series.Points, tsdb.NewTimePoint(null.FloatFromPtr(nil), 2))
			series.Points = append(series.Points, tsdb.NewTimePoint(null.FloatFromPtr(nil), 3))
			series.Points = append(series.Points, tsdb.NewTimePoint(null.FloatFrom(3), 4))

			So(reducer.Reduce(series).Float64, ShouldEqual, float64(3))
		})
	})
}
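The reducer fix above changes the "avg" divisor from the total point count to the count of valid (non-null) points. A minimal standalone sketch of the corrected behavior, with `*float64` standing in for nullable points:

```go
package main

import "fmt"

// avgIgnoringNulls mirrors the fixed "avg" branch: sum only valid points and
// divide by the count of valid points, not by len(points).
func avgIgnoringNulls(points []*float64) (float64, bool) {
	sum, valid := 0.0, 0
	for _, p := range points {
		if p != nil {
			sum += *p
			valid++
		}
	}
	if valid == 0 {
		return 0, false // all null: no value to report
	}
	return sum / float64(valid), true
}

func main() {
	f := func(v float64) *float64 { return &v }
	avg, ok := avgIgnoringNulls([]*float64{f(3), nil, nil, f(3)})
	fmt.Println(avg, ok) // 3 true, where dividing by len(points) would give 1.5
}
```

This is the same scenario as the "avg of number values and null values should ignore nulls" test above: two valid points of 3 among four total should average to 3.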
118
pkg/services/alerting/notifiers/opsgenie.go
Normal file
@@ -0,0 +1,118 @@
package notifiers

import (
	"fmt"
	"strconv"

	"github.com/grafana/grafana/pkg/bus"
	"github.com/grafana/grafana/pkg/components/simplejson"
	"github.com/grafana/grafana/pkg/log"
	"github.com/grafana/grafana/pkg/metrics"
	m "github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/services/alerting"
)

func init() {
	alerting.RegisterNotifier("opsgenie", NewOpsGenieNotifier)
}

var (
	opsgenieCreateAlertURL string = "https://api.opsgenie.com/v1/json/alert"
	opsgenieCloseAlertURL string = "https://api.opsgenie.com/v1/json/alert/close"
)

func NewOpsGenieNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
	autoClose := model.Settings.Get("autoClose").MustBool(true)
	apiKey := model.Settings.Get("apiKey").MustString()
	if apiKey == "" {
		return nil, alerting.ValidationError{Reason: "Could not find api key property in settings"}
	}

	return &OpsGenieNotifier{
		NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
		ApiKey: apiKey,
		AutoClose: autoClose,
		log: log.New("alerting.notifier.opsgenie"),
	}, nil
}

type OpsGenieNotifier struct {
	NotifierBase
	ApiKey string
	AutoClose bool
	log log.Logger
}

func (this *OpsGenieNotifier) Notify(evalContext *alerting.EvalContext) error {
	metrics.M_Alerting_Notification_Sent_OpsGenie.Inc(1)

	var err error
	switch evalContext.Rule.State {
	case m.AlertStateOK:
		if this.AutoClose {
			err = this.closeAlert(evalContext)
		}
	case m.AlertStateAlerting:
		err = this.createAlert(evalContext)
	}
	return err
}

func (this *OpsGenieNotifier) createAlert(evalContext *alerting.EvalContext) error {
	this.log.Info("Creating OpsGenie alert", "ruleId", evalContext.Rule.Id, "notification", this.Name)

	ruleUrl, err := evalContext.GetRuleUrl()
	if err != nil {
		this.log.Error("Failed get rule link", "error", err)
		return err
	}

	bodyJSON := simplejson.New()
	bodyJSON.Set("apiKey", this.ApiKey)
	bodyJSON.Set("message", evalContext.Rule.Name)
	bodyJSON.Set("source", "Grafana")
	bodyJSON.Set("alias", "alertId-"+strconv.FormatInt(evalContext.Rule.Id, 10))
	bodyJSON.Set("description", fmt.Sprintf("%s - %s\n%s", evalContext.Rule.Name, ruleUrl, evalContext.Rule.Message))

	details := simplejson.New()
	details.Set("url", ruleUrl)
	if evalContext.ImagePublicUrl != "" {
		details.Set("image", evalContext.ImagePublicUrl)
	}

	bodyJSON.Set("details", details)
	body, _ := bodyJSON.MarshalJSON()

	cmd := &m.SendWebhookSync{
		Url: opsgenieCreateAlertURL,
		Body: string(body),
		HttpMethod: "POST",
	}

	if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
		this.log.Error("Failed to send notification to OpsGenie", "error", err, "body", string(body))
	}

	return nil
}

func (this *OpsGenieNotifier) closeAlert(evalContext *alerting.EvalContext) error {
	this.log.Info("Closing OpsGenie alert", "ruleId", evalContext.Rule.Id, "notification", this.Name)

	bodyJSON := simplejson.New()
	bodyJSON.Set("apiKey", this.ApiKey)
	bodyJSON.Set("alias", "alertId-"+strconv.FormatInt(evalContext.Rule.Id, 10))
	body, _ := bodyJSON.MarshalJSON()

	cmd := &m.SendWebhookSync{
		Url: opsgenieCloseAlertURL,
		Body: string(body),
		HttpMethod: "POST",
	}

	if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
		this.log.Error("Failed to send notification to OpsGenie", "error", err, "body", string(body))
	}

	return nil
}
52
pkg/services/alerting/notifiers/opsgenie_test.go
Normal file
@@ -0,0 +1,52 @@
package notifiers

import (
	"testing"

	"github.com/grafana/grafana/pkg/components/simplejson"
	m "github.com/grafana/grafana/pkg/models"
	. "github.com/smartystreets/goconvey/convey"
)

func TestOpsGenieNotifier(t *testing.T) {
	Convey("OpsGenie notifier tests", t, func() {

		Convey("Parsing alert notification from settings", func() {
			Convey("empty settings should return error", func() {
				json := `{ }`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name: "opsgenie_testing",
					Type: "opsgenie",
					Settings: settingsJSON,
				}

				_, err := NewOpsGenieNotifier(model)
				So(err, ShouldNotBeNil)
			})

			Convey("settings should trigger incident", func() {
				json := `
				{
					"apiKey": "abcdefgh0123456789"
				}`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name: "opsgenie_testing",
					Type: "opsgenie",
					Settings: settingsJSON,
				}

				not, err := NewOpsGenieNotifier(model)
				opsgenieNotifier := not.(*OpsGenieNotifier)

				So(err, ShouldBeNil)
				So(opsgenieNotifier.Name, ShouldEqual, "opsgenie_testing")
				So(opsgenieNotifier.Type, ShouldEqual, "opsgenie")
				So(opsgenieNotifier.ApiKey, ShouldEqual, "abcdefgh0123456789")
			})
		})
	})
}
100
pkg/services/alerting/notifiers/victorops.go
Normal file
@@ -0,0 +1,100 @@
package notifiers

import (
	"encoding/json"
	"time"

	"github.com/grafana/grafana/pkg/bus"
	"github.com/grafana/grafana/pkg/log"
	"github.com/grafana/grafana/pkg/metrics"
	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/services/alerting"
	"github.com/grafana/grafana/pkg/setting"
)

// AlertStateCritical - Victorops uses "CRITICAL" string to indicate "Alerting" state
const AlertStateCritical = "CRITICAL"

func init() {
	alerting.RegisterNotifier("victorops", NewVictoropsNotifier)
}

// NewVictoropsNotifier creates an instance of VictoropsNotifier that
// handles posting notifications to Victorops REST API
func NewVictoropsNotifier(model *models.AlertNotification) (alerting.Notifier, error) {
	url := model.Settings.Get("url").MustString()
	if url == "" {
		return nil, alerting.ValidationError{Reason: "Could not find victorops url property in settings"}
	}

	return &VictoropsNotifier{
		NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
		URL: url,
		log: log.New("alerting.notifier.victorops"),
	}, nil
}

// VictoropsNotifier defines URL property for Victorops REST API
// and handles notification process by formatting POST body according to
// Victorops specifications (http://victorops.force.com/knowledgebase/articles/Integration/Alert-Ingestion-API-Documentation/)
type VictoropsNotifier struct {
	NotifierBase
	URL string
	log log.Logger
}

// Notify sends notification to Victorops via POST to URL endpoint
func (this *VictoropsNotifier) Notify(evalContext *alerting.EvalContext) error {
	this.log.Info("Executing victorops notification", "ruleId", evalContext.Rule.Id, "notification", this.Name)
	metrics.M_Alerting_Notification_Sent_Victorops.Inc(1)

	ruleUrl, err := evalContext.GetRuleUrl()
	if err != nil {
		this.log.Error("Failed get rule link", "error", err)
		return err
	}

	fields := make([]map[string]interface{}, 0)
	fieldLimitCount := 4
	for index, evt := range evalContext.EvalMatches {
		fields = append(fields, map[string]interface{}{
			"title": evt.Metric,
			"value": evt.Value,
			"short": true,
		})
		if index > fieldLimitCount {
			break
		}
	}

	if evalContext.Error != nil {
		fields = append(fields, map[string]interface{}{
			"title": "Error message",
			"value": evalContext.Error.Error(),
			"short": false,
		})
	}

	messageType := evalContext.Rule.State
	if evalContext.Rule.State == models.AlertStateAlerting { // translate 'Alerting' to 'CRITICAL' (Victorops analog)
		messageType = AlertStateCritical
	}

	body := map[string]interface{}{
		"message_type": messageType,
		"entity_id": evalContext.Rule.Name,
		"timestamp": time.Now().Unix(),
		"state_start_time": evalContext.StartTime.Unix(),
		"state_message": evalContext.Rule.Message + "\n" + ruleUrl,
		"monitoring_tool": "Grafana v" + setting.BuildVersion,
	}

	data, _ := json.Marshal(&body)
	cmd := &models.SendWebhookSync{Url: this.URL, Body: string(data)}

	if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
		this.log.Error("Failed to send victorops notification", "error", err, "webhook", this.Name)
	}

	return nil
}
52
pkg/services/alerting/notifiers/victorops_test.go
Normal file
@@ -0,0 +1,52 @@
package notifiers

import (
	"testing"

	"github.com/grafana/grafana/pkg/components/simplejson"
	m "github.com/grafana/grafana/pkg/models"
	. "github.com/smartystreets/goconvey/convey"
)

func TestVictoropsNotifier(t *testing.T) {
	Convey("Victorops notifier tests", t, func() {

		Convey("Parsing alert notification from settings", func() {
			Convey("empty settings should return error", func() {
				json := `{ }`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name: "victorops_testing",
					Type: "victorops",
					Settings: settingsJSON,
				}

				_, err := NewVictoropsNotifier(model)
				So(err, ShouldNotBeNil)
			})

			Convey("from settings", func() {
				json := `
				{
					"url": "http://google.com"
				}`

				settingsJSON, _ := simplejson.NewJson([]byte(json))
				model := &m.AlertNotification{
					Name: "victorops_testing",
					Type: "victorops",
					Settings: settingsJSON,
				}

				not, err := NewVictoropsNotifier(model)
				victoropsNotifier := not.(*VictoropsNotifier)

				So(err, ShouldBeNil)
				So(victoropsNotifier.Name, ShouldEqual, "victorops_testing")
				So(victoropsNotifier.Type, ShouldEqual, "victorops")
				So(victoropsNotifier.URL, ShouldEqual, "http://google.com")
			})
		})
	})
}
@@ -58,6 +58,10 @@ func (this *WebhookNotifier) Notify(evalContext *alerting.EvalContext) error {
		bodyJSON.Set("imageUrl", evalContext.ImagePublicUrl)
	}

+	if evalContext.Rule.Message != "" {
+		bodyJSON.Set("message", evalContext.Rule.Message)
+	}
+
	body, _ := bodyJSON.MarshalJSON()

	cmd := &m.SendWebhookSync{

@@ -59,7 +59,7 @@ func (arr *DefaultRuleReader) Fetch() []*Rule {
		}
	}

-	metrics.M_Alerting_Active_Alerts.Inc(int64(len(res)))
+	metrics.M_Alerting_Active_Alerts.Update(int64(len(res)))
	return res
}

@@ -6,7 +6,6 @@ import (
	"fmt"
	"io/ioutil"
	"net/http"
-	"time"

	"golang.org/x/net/context/ctxhttp"

@@ -22,8 +21,10 @@ type Webhook struct {
	HttpMethod string
}

-var webhookQueue chan *Webhook
-var webhookLog log.Logger
+var (
+	webhookQueue chan *Webhook
+	webhookLog   log.Logger
+)

func initWebhookQueue() {
	webhookLog = log.New("notifications.webhook")

@@ -47,24 +48,22 @@ func processWebhookQueue() {
func sendWebRequestSync(ctx context.Context, webhook *Webhook) error {
	webhookLog.Debug("Sending webhook", "url", webhook.Url, "http method", webhook.HttpMethod)

-	client := &http.Client{
-		Timeout: time.Duration(10 * time.Second),
-	}
-
	if webhook.HttpMethod == "" {
		webhook.HttpMethod = http.MethodPost
	}

	request, err := http.NewRequest(webhook.HttpMethod, webhook.Url, bytes.NewReader([]byte(webhook.Body)))
-	if webhook.User != "" && webhook.Password != "" {
-		request.Header.Add("Authorization", util.GetBasicAuthHeader(webhook.User, webhook.Password))
-	}
-
	if err != nil {
		return err
	}

-	resp, err := ctxhttp.Do(ctx, client, request)
+	request.Header.Add("Content-Type", "application/json")
+	request.Header.Add("User-Agent", "Grafana")
+	if webhook.User != "" && webhook.Password != "" {
+		request.Header.Add("Authorization", util.GetBasicAuthHeader(webhook.User, webhook.Password))
+	}
+
+	resp, err := ctxhttp.Do(ctx, http.DefaultClient, request)
	if err != nil {
		return err
	}

@@ -73,11 +72,11 @@ func sendWebRequestSync(ctx context.Context, webhook *Webhook) error {
		return nil
	}

+	defer resp.Body.Close()
	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		return err
	}
-	defer resp.Body.Close()

	webhookLog.Debug("Webhook failed", "statuscode", resp.Status, "body", string(body))
	return fmt.Errorf("Webhook response status %v", resp.Status)
@@ -4,6 +4,7 @@ import (
	"time"

	"github.com/grafana/grafana/pkg/bus"
+	"github.com/grafana/grafana/pkg/components/securejsondata"
	m "github.com/grafana/grafana/pkg/models"

	"github.com/go-xorm/xorm"

@@ -82,6 +83,7 @@ func AddDataSource(cmd *m.AddDataSourceCommand) error {
		BasicAuthPassword: cmd.BasicAuthPassword,
		WithCredentials:   cmd.WithCredentials,
		JsonData:          cmd.JsonData,
+		SecureJsonData:    securejsondata.GetEncryptedJsonData(cmd.SecureJsonData),
		Created:           time.Now(),
		Updated:           time.Now(),
	}

@@ -128,6 +130,7 @@ func UpdateDataSource(cmd *m.UpdateDataSourceCommand) error {
		BasicAuthPassword: cmd.BasicAuthPassword,
		WithCredentials:   cmd.WithCredentials,
		JsonData:          cmd.JsonData,
+		SecureJsonData:    securejsondata.GetEncryptedJsonData(cmd.SecureJsonData),
		Updated:           time.Now(),
	}

@@ -101,4 +101,9 @@ func addDataSourceMigration(mg *Migrator) {
	mg.AddMigration("Add column with_credentials", NewAddColumnMigration(tableV2, &Column{
		Name: "with_credentials", Type: DB_Bool, Nullable: false, Default: "0",
	}))
+
+	// add column that can store TLS client auth data
+	mg.AddMigration("Add secure json data column", NewAddColumnMigration(tableV2, &Column{
+		Name: "secure_json_data", Type: DB_Text, Nullable: true,
+	}))
}

@@ -176,6 +176,11 @@ func UpdateOrgAddress(cmd *m.UpdateOrgAddressCommand) error {

func DeleteOrg(cmd *m.DeleteOrgCommand) error {
	return inTransaction2(func(sess *session) error {
+		if res, err := sess.Query("SELECT 1 from org WHERE id=?", cmd.Id); err != nil {
+			return err
+		} else if len(res) != 1 {
+			return m.ErrOrgNotFound
+		}
+
		deletes := []string{
			"DELETE FROM star WHERE EXISTS (SELECT 1 FROM dashboard WHERE org_id = ? AND star.dashboard_id = dashboard.id)",
@@ -23,12 +23,13 @@ import (
	_ "github.com/mattn/go-sqlite3"
)

-type MySQLConfig struct {
-	SslMode        string
-	CaCertPath     string
-	ClientKeyPath  string
-	ClientCertPath string
-	ServerCertName string
+type DatabaseConfig struct {
+	Type, Host, Name, User, Pwd, Path, SslMode string
+	CaCertPath     string
+	ClientKeyPath  string
+	ClientCertPath string
+	ServerCertName string
}

var (
@@ -37,11 +38,8 @@ var (

	HasEngine bool

-	DbCfg struct {
-		Type, Host, Name, User, Pwd, Path, SslMode string
-	}
+	DbCfg DatabaseConfig

-	mysqlConfig MySQLConfig
	UseSQLite3 bool
	sqlog      log.Logger = log.New("sqlstore")
)

@@ -118,8 +116,8 @@ func getEngine() (*xorm.Engine, error) {
		cnnstr = fmt.Sprintf("%s:%s@%s(%s)/%s?charset=utf8",
			DbCfg.User, DbCfg.Pwd, protocol, DbCfg.Host, DbCfg.Name)

-		if mysqlConfig.SslMode == "true" || mysqlConfig.SslMode == "skip-verify" {
-			tlsCert, err := makeCert("custom", mysqlConfig)
+		if DbCfg.SslMode == "true" || DbCfg.SslMode == "skip-verify" {
+			tlsCert, err := makeCert("custom", DbCfg)
			if err != nil {
				return nil, err
			}

@@ -141,7 +139,7 @@ func getEngine() (*xorm.Engine, error) {
		if DbCfg.User == "" {
			DbCfg.User = "''"
		}
-		cnnstr = fmt.Sprintf("user=%s password=%s host=%s port=%s dbname=%s sslmode=%s", DbCfg.User, DbCfg.Pwd, host, port, DbCfg.Name, DbCfg.SslMode)
+		cnnstr = fmt.Sprintf("user=%s password=%s host=%s port=%s dbname=%s sslmode=%s sslcert=%s sslkey=%s sslrootcert=%s", DbCfg.User, DbCfg.Pwd, host, port, DbCfg.Name, DbCfg.SslMode, DbCfg.ClientCertPath, DbCfg.ClientKeyPath, DbCfg.CaCertPath)
	case "sqlite3":
		if !filepath.IsAbs(DbCfg.Path) {
			DbCfg.Path = filepath.Join(setting.DataPath, DbCfg.Path)

@@ -189,13 +187,9 @@ func LoadConfig() {
		UseSQLite3 = true
	}
	DbCfg.SslMode = sec.Key("ssl_mode").String()
+	DbCfg.CaCertPath = sec.Key("ca_cert_path").String()
+	DbCfg.ClientKeyPath = sec.Key("client_key_path").String()
+	DbCfg.ClientCertPath = sec.Key("client_cert_path").String()
+	DbCfg.ServerCertName = sec.Key("server_cert_name").String()
	DbCfg.Path = sec.Key("path").MustString("data/grafana.db")
-
-	if DbCfg.Type == "mysql" {
-		mysqlConfig.SslMode = DbCfg.SslMode
-		mysqlConfig.CaCertPath = sec.Key("ca_cert_path").String()
-		mysqlConfig.ClientKeyPath = sec.Key("client_key_path").String()
-		mysqlConfig.ClientCertPath = sec.Key("client_cert_path").String()
-		mysqlConfig.ServerCertName = sec.Key("server_cert_name").String()
-	}
}

@@ -7,7 +7,7 @@ import (
	"io/ioutil"
)

-func makeCert(tlsPoolName string, config MySQLConfig) (*tls.Config, error) {
+func makeCert(tlsPoolName string, config DatabaseConfig) (*tls.Config, error) {
	rootCertPool := x509.NewCertPool()
	pem, err := ioutil.ReadFile(config.CaCertPath)
	if err != nil {
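The Postgres branch of getEngine now appends `sslcert`, `sslkey`, and `sslrootcert` to the connection string, which is how the "Certs for Postgres" enhancement from the changelog reaches the driver. A tiny sketch of that DSN construction, where `postgresConnStr` is a hypothetical helper extracted for illustration:

```go
package main

import "fmt"

// postgresConnStr builds a lib/pq-style keyword/value DSN including the
// TLS client certificate parameters added in the diff above.
func postgresConnStr(user, pwd, host, port, name, sslMode, certPath, keyPath, caPath string) string {
	return fmt.Sprintf(
		"user=%s password=%s host=%s port=%s dbname=%s sslmode=%s sslcert=%s sslkey=%s sslrootcert=%s",
		user, pwd, host, port, name, sslMode, certPath, keyPath, caPath)
}

func main() {
	// Example paths are placeholders, not Grafana defaults.
	fmt.Println(postgresConnStr("grafana", "secret", "127.0.0.1", "5432", "grafana",
		"verify-full", "/etc/grafana/client.crt", "/etc/grafana/client.key", "/etc/grafana/ca.crt"))
}
```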
@@ -325,11 +325,12 @@ func loadSpecifedConfigFile(configFile string) error {
	}

	userConfig, err := ini.Load(configFile)
-	userConfig.BlockMode = false
	if err != nil {
		return fmt.Errorf("Failed to parse %v, %v", configFile, err)
	}

+	userConfig.BlockMode = false
+
	for _, section := range userConfig.Sections() {
		for _, key := range section.Keys() {
			if key.Value() == "" {

@@ -359,9 +360,18 @@ func loadConfiguration(args *CommandLineArgs) {
	defaultConfigFile := path.Join(HomePath, "conf/defaults.ini")
	configFiles = append(configFiles, defaultConfigFile)

+	// check if config file exists
+	if _, err := os.Stat(defaultConfigFile); os.IsNotExist(err) {
+		fmt.Println("Grafana-server Init Failed: Could not find config defaults, make sure homepath command line parameter is set or working directory is homepath")
+		os.Exit(1)
+	}
+
	// load defaults
	Cfg, err = ini.Load(defaultConfigFile)
	if err != nil {
-		log.Fatal(3, "Failed to parse defaults.ini, %v", err)
+		fmt.Println(fmt.Sprintf("Failed to parse defaults.ini, %v", err))
+		os.Exit(1)
+		return
	}

	Cfg.BlockMode = false
@@ -1,9 +1,6 @@
package tsdb

-import (
-	"context"
-	"errors"
-)
+import "context"

type Batch struct {
	DataSourceId int64

@@ -24,12 +21,12 @@ func newBatch(dsId int64, queries QuerySlice) *Batch {
}

func (bg *Batch) process(ctx context.Context, queryContext *QueryContext) {
-	executor := getExecutorFor(bg.Queries[0].DataSource)
+	executor, err := getExecutorFor(bg.Queries[0].DataSource)

-	if executor == nil {
+	if err != nil {
		bg.Done = true
		result := &BatchResult{
-			Error:        errors.New("Could not find executor for data source type: " + bg.Queries[0].DataSource.PluginId),
+			Error:        err,
			QueryResults: make(map[string]*QueryResult),
		}
		for _, query := range bg.Queries {

@@ -1,6 +1,11 @@
package tsdb

-import "context"
+import (
+	"context"
+	"fmt"
+
+	"github.com/grafana/grafana/pkg/models"
+)

type Executor interface {
	Execute(ctx context.Context, queries QuerySlice, query *QueryContext) *BatchResult

@@ -8,17 +13,22 @@ type Executor interface {

var registry map[string]GetExecutorFn

-type GetExecutorFn func(dsInfo *DataSourceInfo) Executor
+type GetExecutorFn func(dsInfo *models.DataSource) (Executor, error)

func init() {
	registry = make(map[string]GetExecutorFn)
}

-func getExecutorFor(dsInfo *DataSourceInfo) Executor {
-	if fn, exists := registry[dsInfo.PluginId]; exists {
-		return fn(dsInfo)
+func getExecutorFor(dsInfo *models.DataSource) (Executor, error) {
+	if fn, exists := registry[dsInfo.Type]; exists {
+		executor, err := fn(dsInfo)
+		if err != nil {
+			return nil, err
+		}
+
+		return executor, nil
	}
-	return nil
+	return nil, fmt.Errorf("Could not find executor for data source type: %s", dsInfo.Type)
}

func RegisterExecutor(pluginId string, fn GetExecutorFn) {
@@ -1,6 +1,10 @@
package tsdb

-import "context"
+import (
+	"context"
+
+	"github.com/grafana/grafana/pkg/models"
+)

type FakeExecutor struct {
	results map[string]*QueryResult

@@ -9,11 +13,11 @@ type FakeExecutor struct {

type ResultsFn func(context *QueryContext) *QueryResult

-func NewFakeExecutor(dsInfo *DataSourceInfo) *FakeExecutor {
+func NewFakeExecutor(dsInfo *models.DataSource) (*FakeExecutor, error) {
	return &FakeExecutor{
		results:   make(map[string]*QueryResult),
		resultsFn: make(map[string]ResultsFn),
-	}
+	}, nil
}

func (e *FakeExecutor) Execute(ctx context.Context, queries QuerySlice, context *QueryContext) *BatchResult {
@@ -14,28 +14,36 @@ import (
	"golang.org/x/net/context/ctxhttp"

	"github.com/grafana/grafana/pkg/log"
+	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/tsdb"
)

type GraphiteExecutor struct {
-	*tsdb.DataSourceInfo
+	*models.DataSource
+	HttpClient *http.Client
}

-func NewGraphiteExecutor(dsInfo *tsdb.DataSourceInfo) tsdb.Executor {
-	return &GraphiteExecutor{dsInfo}
+func NewGraphiteExecutor(datasource *models.DataSource) (tsdb.Executor, error) {
+	httpClient, err := datasource.GetHttpClient()
+
+	if err != nil {
+		return nil, err
+	}
+
+	return &GraphiteExecutor{
+		DataSource: datasource,
+		HttpClient: httpClient,
+	}, nil
}

var (
-	glog       log.Logger
-	HttpClient *http.Client
+	glog log.Logger
)

func init() {
	glog = log.New("tsdb.graphite")
	tsdb.RegisterExecutor("graphite", NewGraphiteExecutor)
-
-	HttpClient = tsdb.GetDefaultClient()
}

func (e *GraphiteExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice, context *tsdb.QueryContext) *tsdb.BatchResult {

@@ -66,7 +74,7 @@ func (e *GraphiteExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice,
		return result
	}

-	res, err := ctxhttp.Do(ctx, HttpClient, req)
+	res, err := ctxhttp.Do(ctx, e.HttpClient, req)
	if err != nil {
		result.Error = err
		return result

@@ -1,29 +0,0 @@
-package tsdb
-
-import (
-	"crypto/tls"
-	"net"
-	"net/http"
-	"time"
-)
-
-func GetDefaultClient() *http.Client {
-	tr := &http.Transport{
-		Proxy: http.ProxyFromEnvironment,
-		DialContext: (&net.Dialer{
-			Timeout:   30 * time.Second,
-			KeepAlive: 30 * time.Second,
-		}).DialContext,
-		MaxIdleConns:          100,
-		IdleConnTimeout:       90 * time.Second,
-		TLSHandshakeTimeout:   10 * time.Second,
-		ExpectContinueTimeout: 1 * time.Second,
-
-		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
-	}
-
-	return &http.Client{
-		Timeout:   time.Duration(30 * time.Second),
-		Transport: tr,
-	}
-}
@@ -11,34 +11,40 @@ import (
	"golang.org/x/net/context/ctxhttp"

	"github.com/grafana/grafana/pkg/log"
+	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/tsdb"
)

type InfluxDBExecutor struct {
-	*tsdb.DataSourceInfo
+	*models.DataSource
	QueryParser    *InfluxdbQueryParser
	ResponseParser *ResponseParser
+	HttpClient     *http.Client
}

-func NewInfluxDBExecutor(dsInfo *tsdb.DataSourceInfo) tsdb.Executor {
+func NewInfluxDBExecutor(datasource *models.DataSource) (tsdb.Executor, error) {
+	httpClient, err := datasource.GetHttpClient()
+
+	if err != nil {
+		return nil, err
+	}
+
	return &InfluxDBExecutor{
-		DataSourceInfo: dsInfo,
+		DataSource:     datasource,
		QueryParser:    &InfluxdbQueryParser{},
		ResponseParser: &ResponseParser{},
-	}
+		HttpClient:     httpClient,
+	}, nil
}

var (
-	glog       log.Logger
-	HttpClient *http.Client
+	glog log.Logger
)

func init() {
	glog = log.New("tsdb.influxdb")
	tsdb.RegisterExecutor("influxdb", NewInfluxDBExecutor)
-
-	HttpClient = tsdb.GetDefaultClient()
}

func (e *InfluxDBExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice, context *tsdb.QueryContext) *tsdb.BatchResult {

@@ -63,7 +69,7 @@ func (e *InfluxDBExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice,
		return result.WithError(err)
	}

-	resp, err := ctxhttp.Do(ctx, HttpClient, req)
+	resp, err := ctxhttp.Do(ctx, e.HttpClient, req)
	if err != nil {
		return result.WithError(err)
	}

@@ -95,7 +101,7 @@ func (e *InfluxDBExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice,
func (e *InfluxDBExecutor) getQuery(queries tsdb.QuerySlice, context *tsdb.QueryContext) (*Query, error) {
	for _, v := range queries {

-		query, err := e.QueryParser.Parse(v.Model, e.DataSourceInfo)
+		query, err := e.QueryParser.Parse(v.Model, e.DataSource)
		if err != nil {
			return nil, err
		}

@@ -127,7 +133,7 @@ func (e *InfluxDBExecutor) createRequest(query string) (*http.Request, error) {
		req.SetBasicAuth(e.BasicAuthUser, e.BasicAuthPassword)
	}

-	if e.User != "" {
+	if !e.BasicAuth && e.User != "" {
		req.SetBasicAuth(e.User, e.Password)
	}

@@ -4,12 +4,12 @@ import (
	"strconv"

	"github.com/grafana/grafana/pkg/components/simplejson"
-	"github.com/grafana/grafana/pkg/tsdb"
+	"github.com/grafana/grafana/pkg/models"
)

type InfluxdbQueryParser struct{}

-func (qp *InfluxdbQueryParser) Parse(model *simplejson.Json, dsInfo *tsdb.DataSourceInfo) (*Query, error) {
+func (qp *InfluxdbQueryParser) Parse(model *simplejson.Json, dsInfo *models.DataSource) (*Query, error) {
	policy := model.Get("policy").MustString("default")
	rawQuery := model.Get("query").MustString("")
	useRawQuery := model.Get("rawQuery").MustBool(false)

@@ -4,7 +4,7 @@ import (
	"testing"

	"github.com/grafana/grafana/pkg/components/simplejson"
-	"github.com/grafana/grafana/pkg/tsdb"
+	"github.com/grafana/grafana/pkg/models"
	. "github.com/smartystreets/goconvey/convey"
)

@@ -12,7 +12,7 @@ func TestInfluxdbQueryParser(t *testing.T) {
	Convey("Influxdb query parser", t, func() {

		parser := &InfluxdbQueryParser{}
-		dsInfo := &tsdb.DataSourceInfo{
+		dsInfo := &models.DataSource{
			JsonData: simplejson.New(),
		}
@@ -2,6 +2,7 @@ package tsdb

import (
	"github.com/grafana/grafana/pkg/components/simplejson"
+	"github.com/grafana/grafana/pkg/models"
	"gopkg.in/guregu/null.v3"
)

@@ -9,7 +10,7 @@ type Query struct {
	RefId         string
	Model         *simplejson.Json
	Depends       []string
-	DataSource    *DataSourceInfo
+	DataSource    *models.DataSource
	Results       []*TimeSeries
	Exclude       bool
	MaxDataPoints int64

@@ -28,20 +29,6 @@ type Response struct {
	Results map[string]*QueryResult `json:"results"`
}

-type DataSourceInfo struct {
-	Id                int64
-	Name              string
-	PluginId          string
-	Url               string
-	Password          string
-	User              string
-	Database          string
-	BasicAuth         bool
-	BasicAuthUser     string
-	BasicAuthPassword string
-	JsonData          *simplejson.Json
-}
-
type BatchTiming struct {
	TimeElapsed int64
}
@@ -17,28 +17,36 @@ import (
	"gopkg.in/guregu/null.v3"

	"github.com/grafana/grafana/pkg/log"
+	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/tsdb"
)

type OpenTsdbExecutor struct {
-	*tsdb.DataSourceInfo
+	*models.DataSource
+	httpClient *http.Client
}

-func NewOpenTsdbExecutor(dsInfo *tsdb.DataSourceInfo) tsdb.Executor {
-	return &OpenTsdbExecutor{dsInfo}
+func NewOpenTsdbExecutor(datasource *models.DataSource) (tsdb.Executor, error) {
+	httpClient, err := datasource.GetHttpClient()
+
+	if err != nil {
+		return nil, err
+	}
+
+	return &OpenTsdbExecutor{
+		DataSource: datasource,
+		httpClient: httpClient,
+	}, nil
}

var (
-	plog       log.Logger
-	HttpClient *http.Client
+	plog log.Logger
)

func init() {
	plog = log.New("tsdb.opentsdb")
	tsdb.RegisterExecutor("opentsdb", NewOpenTsdbExecutor)
-
-	HttpClient = tsdb.GetDefaultClient()
}

func (e *OpenTsdbExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice, queryContext *tsdb.QueryContext) *tsdb.BatchResult {

@@ -64,7 +72,7 @@ func (e *OpenTsdbExecutor) Execute(ctx context.Context, queries tsdb.QuerySlice,
		return result
	}

-	res, err := ctxhttp.Do(ctx, HttpClient, req)
+	res, err := ctxhttp.Do(ctx, e.httpClient, req)
	if err != nil {
		result.Error = err
		return result
@@ -9,18 +9,30 @@ import (

	"gopkg.in/guregu/null.v3"

+	"net/http"
+
	"github.com/grafana/grafana/pkg/log"
+	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/tsdb"
	"github.com/prometheus/client_golang/api/prometheus"
	pmodel "github.com/prometheus/common/model"
)

type PrometheusExecutor struct {
-	*tsdb.DataSourceInfo
+	*models.DataSource
+	Transport *http.Transport
}

-func NewPrometheusExecutor(dsInfo *tsdb.DataSourceInfo) tsdb.Executor {
-	return &PrometheusExecutor{dsInfo}
+func NewPrometheusExecutor(dsInfo *models.DataSource) (tsdb.Executor, error) {
+	transport, err := dsInfo.GetHttpTransport()
+	if err != nil {
+		return nil, err
+	}
+
+	return &PrometheusExecutor{
+		DataSource: dsInfo,
+		Transport:  transport,
+	}, nil
}

var (
@@ -36,7 +48,8 @@ func init() {

func (e *PrometheusExecutor) getClient() (prometheus.QueryAPI, error) {
	cfg := prometheus.Config{
-		Address: e.DataSourceInfo.Url,
+		Address:   e.DataSource.Url,
+		Transport: e.Transport,
	}

	client, err := prometheus.New(cfg)
2  pkg/tsdb/testdata/scenarios.go  vendored
@@ -90,6 +90,8 @@ func init() {
		queryRes := tsdb.NewQueryResult()

		stringInput := query.Model.Get("stringInput").MustString()
+		stringInput = strings.Replace(stringInput, " ", "", -1)
+
		values := []null.Float{}
		for _, strVal := range strings.Split(stringInput, ",") {
			if strVal == "null" {

11  pkg/tsdb/testdata/testdata.go  vendored
@@ -4,19 +4,20 @@ import (
	"context"

	"github.com/grafana/grafana/pkg/log"
+	"github.com/grafana/grafana/pkg/models"
	"github.com/grafana/grafana/pkg/tsdb"
)

type TestDataExecutor struct {
-	*tsdb.DataSourceInfo
+	*models.DataSource
	log log.Logger
}

-func NewTestDataExecutor(dsInfo *tsdb.DataSourceInfo) tsdb.Executor {
+func NewTestDataExecutor(dsInfo *models.DataSource) (tsdb.Executor, error) {
	return &TestDataExecutor{
-		DataSourceInfo: dsInfo,
-		log:            log.New("tsdb.testdata"),
-	}
+		DataSource: dsInfo,
+		log:        log.New("tsdb.testdata"),
+	}, nil
}

func init() {
@@ -5,6 +5,7 @@ import (
	"testing"
	"time"

+	"github.com/grafana/grafana/pkg/models"
	. "github.com/smartystreets/goconvey/convey"
)

@@ -15,9 +16,9 @@ func TestMetricQuery(t *testing.T) {
	Convey("Given 3 queries for 2 data sources", func() {
		request := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1}},
-				{RefId: "B", DataSource: &DataSourceInfo{Id: 1}},
-				{RefId: "C", DataSource: &DataSourceInfo{Id: 2}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1}},
+				{RefId: "B", DataSource: &models.DataSource{Id: 1}},
+				{RefId: "C", DataSource: &models.DataSource{Id: 2}},
			},
		}

@@ -32,9 +33,9 @@ func TestMetricQuery(t *testing.T) {
	Convey("Given query 2 depends on query 1", func() {
		request := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1}},
-				{RefId: "B", DataSource: &DataSourceInfo{Id: 2}},
-				{RefId: "C", DataSource: &DataSourceInfo{Id: 3}, Depends: []string{"A", "B"}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1}},
+				{RefId: "B", DataSource: &models.DataSource{Id: 2}},
+				{RefId: "C", DataSource: &models.DataSource{Id: 3}, Depends: []string{"A", "B"}},
			},
		}

@@ -56,7 +57,7 @@ func TestMetricQuery(t *testing.T) {
	Convey("When executing request with one query", t, func() {
		req := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1, Type: "test"}},
			},
		}

@@ -75,8 +76,8 @@ func TestMetricQuery(t *testing.T) {
	Convey("When executing one request with two queries from same data source", t, func() {
		req := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"}},
-				{RefId: "B", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1, Type: "test"}},
+				{RefId: "B", DataSource: &models.DataSource{Id: 1, Type: "test"}},
			},
		}

@@ -101,9 +102,9 @@ func TestMetricQuery(t *testing.T) {
	Convey("When executing one request with three queries from different datasources", t, func() {
		req := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"}},
-				{RefId: "B", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"}},
-				{RefId: "C", DataSource: &DataSourceInfo{Id: 2, PluginId: "test"}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1, Type: "test"}},
+				{RefId: "B", DataSource: &models.DataSource{Id: 1, Type: "test"}},
+				{RefId: "C", DataSource: &models.DataSource{Id: 2, Type: "test"}},
			},
		}

@@ -118,7 +119,7 @@ func TestMetricQuery(t *testing.T) {
	Convey("When query uses data source of unknown type", t, func() {
		req := &Request{
			Queries: QuerySlice{
-				{RefId: "A", DataSource: &DataSourceInfo{Id: 1, PluginId: "asdasdas"}},
+				{RefId: "A", DataSource: &models.DataSource{Id: 1, Type: "asdasdas"}},
			},
		}

@@ -130,10 +131,10 @@ func TestMetricQuery(t *testing.T) {
		req := &Request{
			Queries: QuerySlice{
				{
-					RefId: "A", DataSource: &DataSourceInfo{Id: 1, PluginId: "test"},
+					RefId: "A", DataSource: &models.DataSource{Id: 1, Type: "test"},
				},
				{
-					RefId: "B", DataSource: &DataSourceInfo{Id: 2, PluginId: "test"}, Depends: []string{"A"},
+					RefId: "B", DataSource: &models.DataSource{Id: 2, Type: "test"}, Depends: []string{"A"},
				},
			},
		}

@@ -167,9 +168,9 @@ func TestMetricQuery(t *testing.T) {
}

func registerFakeExecutor() *FakeExecutor {
-	executor := NewFakeExecutor(nil)
-	RegisterExecutor("test", func(dsInfo *DataSourceInfo) Executor {
-		return executor
+	executor, _ := NewFakeExecutor(nil)
+	RegisterExecutor("test", func(dsInfo *models.DataSource) (Executor, error) {
+		return executor, nil
	})

	return executor
@@ -44,10 +44,6 @@ function setupAngularRoutes($routeProvider, $locationProvider) {
      templateUrl: 'public/app/features/dashboard/partials/dash_list.html',
      controller : 'DashListCtrl',
    })
-    .when('/dashboards/migrate', {
-      templateUrl: 'public/app/features/dashboard/partials/migrate.html',
-      controller : 'DashboardImportCtrl',
-    })
    .when('/datasources', {
      templateUrl: 'public/app/features/plugins/partials/ds_list.html',
      controller : 'DataSourcesCtrl',

@@ -89,6 +89,7 @@ export class KeybindingSrv {

    this.bind('mod+o', () => {
      dashboard.sharedCrosshair = !dashboard.sharedCrosshair;
+      appEvents.emit('graph-hover-clear');
      scope.broadcastRefresh();
    });

@@ -101,7 +102,11 @@ export class KeybindingSrv {
    });

    this.bind('t z', () => {
-      scope.appEvent('zoom-out');
+      scope.appEvent('zoom-out', 2);
    });

+    this.bind('ctrl+z', () => {
+      scope.appEvent('zoom-out', 2);
+    });
+
    this.bind('t left', () => {
@@ -102,6 +102,9 @@ export default class TimeSeries {
    this.stats.min = Number.MAX_VALUE;
    this.stats.avg = null;
    this.stats.current = null;
+    this.stats.first = null;
+    this.stats.delta = 0;
+    this.stats.range = null;
    this.stats.timeStep = Number.MAX_VALUE;
    this.allIsNull = true;
    this.allIsZero = true;

@@ -112,6 +115,8 @@ export default class TimeSeries {
    var currentValue;
    var nonNulls = 0;
    var previousTime;
+    var previousValue = 0;
+    var previousDeltaUp = true;

    for (var i = 0; i < this.datapoints.length; i++) {
      currentValue = this.datapoints[i][0];

@@ -148,6 +153,24 @@ export default class TimeSeries {
        if (currentValue < this.stats.min) {
          this.stats.min = currentValue;
        }
+        if (this.stats.first === null) {
+          this.stats.first = currentValue;
+        } else {
+          if (previousValue > currentValue) { // counter reset
+            previousDeltaUp = false;
+            if (i === this.datapoints.length-1) { // reset on last
+              this.stats.delta += currentValue;
+            }
+          } else {
+            if (previousDeltaUp) {
+              this.stats.delta += currentValue - previousValue; // normal increment
+            } else {
+              this.stats.delta += currentValue; // account for counter reset
+            }
+            previousDeltaUp = true;
+          }
+        }
+        previousValue = currentValue;
      }

      if (currentValue !== 0) {

@@ -167,6 +190,9 @@ export default class TimeSeries {
        this.stats.current = result[result.length-2][1];
      }
    }
+    if (this.stats.max !== null && this.stats.min !== null) {
+      this.stats.range = this.stats.max - this.stats.min;
+    }

    this.stats.count = result.length;
    return result;
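The delta calculation added to TimeSeries above treats any drop in value as a counter reset: after a reset the next value is added back in full rather than as a difference, and a reset on the very last point contributes that point directly. Transcribed into Go for illustration, where `counterDelta` is a hypothetical standalone version of the TypeScript loop:

```go
package main

import "fmt"

// counterDelta accumulates the total increase of a monotonically
// increasing counter, tolerating resets (value drops) mid-series.
func counterDelta(points []float64) float64 {
	var delta, previous float64
	previousDeltaUp := true
	for i, v := range points {
		if i == 0 { // first point only seeds the comparison
			previous = v
			continue
		}
		if previous > v { // counter reset
			previousDeltaUp = false
			if i == len(points)-1 { // reset on last point
				delta += v
			}
		} else {
			if previousDeltaUp {
				delta += v - previous // normal increment
			} else {
				delta += v // account for counter reset
			}
			previousDeltaUp = true
		}
		previous = v
	}
	return delta
}

func main() {
	fmt.Println(counterDelta([]float64{10, 20, 30}))    // 20: two normal increments
	fmt.Println(counterDelta([]float64{10, 20, 5, 15})) // 25: +10, reset, +15 added back
}
```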
@@ -38,10 +38,10 @@ var rangeOptions = [
  { from: 'now-12h', to: 'now', display: 'Last 12 hours', section: 3 },
  { from: 'now-24h', to: 'now', display: 'Last 24 hours', section: 3 },

  { from: 'now-2d',  to: 'now', display: 'Last 2 days',   section: 0 },
  { from: 'now-7d',  to: 'now', display: 'Last 7 days',   section: 0 },
  { from: 'now-30d', to: 'now', display: 'Last 30 days',  section: 0 },
  { from: 'now-60d', to: 'now', display: 'Last 60 days',  section: 0 },
  { from: 'now-90d', to: 'now', display: 'Last 90 days',  section: 0 },
  { from: 'now-6M',  to: 'now', display: 'Last 6 months', section: 0 },
  { from: 'now-1y',  to: 'now', display: 'Last 1 year',   section: 0 },
  { from: 'now-2y',  to: 'now', display: 'Last 2 years',  section: 0 },
@@ -91,8 +91,10 @@ export class AlertTabCtrl {
    switch (type) {
      case "email": return "fa fa-envelope";
      case "slack": return "fa fa-slack";
      case "victorops": return "fa fa-pagelines";
      case "webhook": return "fa fa-cubes";
      case "pagerduty": return "fa fa-bullhorn";
      case "opsgenie": return "fa fa-bell";
    }
  }
@@ -19,7 +19,7 @@
  <div class="gf-form">
    <span class="gf-form-label width-12">Type</span>
    <div class="gf-form-select-wrapper width-15">
      <select class="gf-form-input" ng-model="ctrl.model.type" ng-options="t for t in ['webhook', 'email', 'slack', 'pagerduty']" ng-change="ctrl.typeChanged(notification, $index)">
      <select class="gf-form-input" ng-model="ctrl.model.type" ng-options="t for t in ['webhook', 'email', 'slack', 'pagerduty', 'victorops', 'opsgenie']" ng-change="ctrl.typeChanged(notification, $index)">
      </select>
    </div>
  </div>

@@ -87,6 +87,14 @@
    </div>
  </div>

  <div class="gf-form-group" ng-if="ctrl.model.type === 'victorops'">
    <h3 class="page-heading">VictorOps settings</h3>
    <div class="gf-form">
      <span class="gf-form-label width-6">Url</span>
      <input type="text" required class="gf-form-input max-width-30" ng-model="ctrl.model.settings.url" placeholder="VictorOps url"></input>
    </div>
  </div>

  <div class="gf-form-group section" ng-if="ctrl.model.type === 'email'">
    <h3 class="page-heading">Email addresses</h3>
    <div class="gf-form">

@@ -114,6 +122,23 @@
    </div>
  </div>

  <div class="gf-form-group" ng-if="ctrl.model.type === 'opsgenie'">
    <h3 class="page-heading">OpsGenie settings</h3>
    <div class="gf-form">
      <span class="gf-form-label width-14">API Key</span>
      <input type="text" required class="gf-form-input max-width-22" ng-model="ctrl.model.settings.apiKey" placeholder="OpsGenie API Key"></input>
    </div>
    <div class="gf-form">
      <gf-form-switch
        class="gf-form"
        label="Auto close incidents"
        label-class="width-14"
        checked="ctrl.model.settings.autoClose"
        tooltip="Automatically close alerts in OpsGenie once the alert goes back to ok.">
      </gf-form-switch>
    </div>
  </div>

  <div class="gf-form-group">
    <div class="gf-form-inline">
      <div class="gf-form width-6">
@@ -13,7 +13,6 @@ define([
  './unsavedChangesSrv',
  './timepicker/timepicker',
  './graphiteImportCtrl',
  './importCtrl',
  './impression_store',
  './upload',
  './import/dash_import',
@@ -14,6 +14,18 @@
      </span>
    </a>

    <ul class="nav dash-playlist-actions" ng-if="playlistSrv">
      <li>
        <a ng-click="playlistSrv.prev()"><i class="fa fa-step-backward"></i></a>
      </li>
      <li>
        <a ng-click="playlistSrv.stop()"><i class="fa fa-stop"></i></a>
      </li>
      <li>
        <a ng-click="playlistSrv.next()"><i class="fa fa-step-forward"></i></a>
      </li>
    </ul>

    <ul class="nav pull-left dashnav-action-icons">
      <li ng-show="::dashboardMeta.canStar">
        <a class="pointer" ng-click="starDashboard()">

@@ -68,18 +80,6 @@
      </li>
    </ul>

    <ul class="nav dash-playlist-actions" ng-if="playlistSrv">
      <li>
        <a ng-click="playlistSrv.prev()"><i class="fa fa-step-backward"></i></a>
      </li>
      <li>
        <a ng-click="playlistSrv.stop()"><i class="fa fa-stop"></i></a>
      </li>
      <li>
        <a ng-click="playlistSrv.next()"><i class="fa fa-step-forward"></i></a>
      </li>
    </ul>

    <ul class="nav pull-right">
      <li ng-show="dashboard.meta.fullscreen" class="dashnav-back-to-dashboard">
        <a ng-click="exitFullscreen()">
@@ -123,11 +123,11 @@
  </div>

  <div class="gf-form-button-row">
    <button type="button" class="btn gf-form-btn btn-success width-10" ng-click="ctrl.saveDashboard()" ng-hide="ctrl.nameExists" ng-disabled="!ctrl.inputsValid">
      <i class="fa fa-save"></i> Save &amp; Open
    <button type="button" class="btn gf-form-btn btn-success width-12" ng-click="ctrl.saveDashboard()" ng-hide="ctrl.nameExists" ng-disabled="!ctrl.inputsValid">
      <i class="fa fa-save"></i> Import
    </button>
    <button type="button" class="btn gf-form-btn btn-danger width-10" ng-click="ctrl.saveDashboard()" ng-show="ctrl.nameExists" ng-disabled="!ctrl.inputsValid">
      <i class="fa fa-save"></i> Overwrite &amp; Open
    <button type="button" class="btn gf-form-btn btn-danger width-12" ng-click="ctrl.saveDashboard()" ng-show="ctrl.nameExists" ng-disabled="!ctrl.inputsValid">
      <i class="fa fa-save"></i> Import (Overwrite)
    </button>
    <a class="btn btn-link" ng-click="dismiss()">Cancel</a>
    <a class="btn btn-link" ng-click="ctrl.back()">Back</a>
@@ -1,81 +0,0 @@
define([
  'angular',
  'lodash',
],
function (angular, _) {
  'use strict';

  var module = angular.module('grafana.controllers');

  module.controller('DashboardImportCtrl', function($scope, $http, backendSrv, datasourceSrv) {

    $scope.init = function() {
      $scope.datasources = [];
      $scope.sourceName = 'grafana';
      $scope.destName = 'grafana';
      $scope.imported = [];
      $scope.dashboards = [];
      $scope.infoText = '';
      $scope.importing = false;

      _.each(datasourceSrv.getAll(), function(ds, key) {
        if (ds.type === 'influxdb_08' || ds.type === 'elasticsearch') {
          $scope.sourceName = key;
          $scope.datasources.push(key);
        }
      });
    };

    $scope.startImport = function() {
      datasourceSrv.get($scope.sourceName).then(function(ds) {
        $scope.dashboardSource = ds;
        $scope.dashboardSource.searchDashboards('title:').then(function(results) {
          $scope.dashboards = results.dashboards;

          if ($scope.dashboards.length === 0) {
            $scope.infoText = 'No dashboards found';
            return;
          }

          $scope.importing = true;
          $scope.imported = [];
          $scope.next();
        }, function(err) {
          var resp = err.message || err.statusText || 'Unknown error';
          var message = "Failed to load dashboards from selected data source, response from server was: " + resp;
          $scope.appEvent('alert-error', ['Import failed', message]);
        });
      });
    };

    $scope.next = function() {
      if ($scope.dashboards.length === 0) {
        $scope.infoText = "Done! Imported " + $scope.imported.length + " dashboards";
      }

      var dash = $scope.dashboards.shift();
      if (!dash.title) {
        console.log(dash);
        return;
      }

      var infoObj = {name: dash.title, info: 'Importing...'};
      $scope.imported.push(infoObj);
      $scope.infoText = "Importing " + $scope.imported.length + '/' + ($scope.imported.length + $scope.dashboards.length);

      $scope.dashboardSource.getDashboard(dash.id).then(function(loadedDash) {
        backendSrv.saveDashboard(loadedDash).then(function() {
          infoObj.info = "Done!";
          $scope.next();
        }, function(err) {
          err.isHandled = true;
          infoObj.info = "Error: " + (err.data || { message: 'Unknown' }).message;
          $scope.next();
        });
      });
    };

    $scope.init();

  });
});
@@ -1,44 +0,0 @@
<navbar title="Migrate" title-url="dashboards/migrate" icon="fa fa-download">
</navbar>

<div class="page-container">
  <div class="page-header">
    <h1>
      Migrate dashboards
    </h1>
  </div>

  <h5 class="section-heading">
    Import dashboards from Elasticsearch or InfluxDB
  </h5>

  <div class="gf-form-inline gf-form-group">
    <div class="gf-form">
      <div class="gf-form-label">Dashboard source</div>
      <div class="gf-form-select-wrapper">
        <select class="gf-form-input gf-size-auto" ng-model="sourceName" ng-options="f for f in datasources"></select>
      </div>
    </div>
    <div class="gf-form">
      <button class="btn btn-success gf-form-btn" ng-click="startImport()">Import</button>
    </div>
  </div>

  <h5 class="section-heading" ng-if="importing">{{infoText}}</h5>
  <div class="editor-row" ng-if="importing">
    <div class="editor-row row">
      <table class="grafana-options-table span5">
        <tr ng-repeat="dash in imported">
          <td>{{dash.name}}</td>
          <td>
            {{dash.info}}
          </td>
        </tr>
      </table>
    </div>
  </div>

  <div ng-include="'public/app/features/dashboard/partials/graphiteImport.html'"></div>

</div>
@@ -57,13 +57,13 @@
  </gf-form-switch>
  <gf-form-switch class="gf-form"
    label="Hide Controls"
    tooltip="Hide row controls. Shortcut: CTRL+H"
    tooltip="Hide row controls. Shortcut: CTRL+H or CMD+H"
    checked="dashboard.hideControls"
    label-class="width-11">
  </gf-form-switch>
  <gf-form-switch class="gf-form"
    label="Shared Crosshair"
    tooltip="Shared Crosshair line on all graphs. Shortcut: CTRL+O"
    label="Shared Tooltip"
    tooltip="Shared Tooltip on all graphs. Shortcut: CTRL+O or CMD+O"
    checked="dashboard.sharedCrosshair"
    label-class="width-11">
  </gf-form-switch>
@@ -44,7 +44,7 @@ export class DashRowCtrl {
    if (dropTarget) {
      dropTarget = this.dashboard.getPanelInfoById(dropTarget.id);
      // if dragging new panel onto existing panel, split it
      if (dragObject.isNew) {
      if (dragObject.panel.isNew) {
        dragObject.panel.span = dropTarget.panel.span = dropTarget.panel.span/2;
        // insert after
        dropTarget.row.panels.splice(dropTarget.index+1, 0, dragObject.panel);
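The drop-to-split behaviour in the DashRowCtrl hunk above can be sketched in isolation: dropping a new panel onto an existing one halves the target's span, gives the new panel the freed half, and inserts it right after the target. Names here (`splitPanelOnDrop`) are illustrative, not Grafana's API:

```javascript
// Sketch of the panel-split-on-drop logic from the hunk above.
function splitPanelOnDrop(rowPanels, targetIndex, newPanel) {
  var target = rowPanels[targetIndex];
  target.span = target.span / 2;                  // shrink the drop target
  newPanel.span = target.span;                    // new panel takes the freed half
  rowPanels.splice(targetIndex + 1, 0, newPanel); // insert after the target
  return rowPanels;
}
```

Dropping onto a full-width (span 12) panel leaves two side-by-side span-6 panels.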
@@ -86,10 +86,18 @@ export class DashboardRow {

  removePanel(panel, ask?) {
    if (ask !== false) {
      var text2, confirmText;
      if (panel.alert) {
        text2 = "Panel includes an alert rule, removing panel will also remove alert rule";
        confirmText = "YES";
      }

      appEvents.emit('confirm-modal', {
        title: 'Remove Panel',
        text: 'Are you sure you want to remove this panel?',
        text2: text2,
        icon: 'fa-trash',
        confirmText: confirmText,
        yesText: 'Remove',
        onConfirm: () => {
          this.removePanel(panel, false);
@@ -110,6 +110,7 @@ function(angular, _) {
    _.each(dash.templating.list, function(value) {
      value.current = null;
      value.options = null;
      value.filters = null;
    });
  };
@@ -1,7 +1,6 @@
///<reference path="../../headers/common.d.ts" />

import angular from 'angular';
import config from 'app/core/config';
import coreModule from '../../core/core_module';
import kbn from 'app/core/utils/kbn';

@@ -11,6 +10,7 @@ class PlaylistSrv {
  private index: number;
  private interval: any;
  private playlistId: number;
  private startUrl: string;

  /** @ngInject */
  constructor(private $rootScope: any, private $location: any, private $timeout: any, private backendSrv: any) { }

@@ -21,7 +21,7 @@ class PlaylistSrv {
    var playedAllDashboards = this.index > this.dashboards.length - 1;

    if (playedAllDashboards) {
      window.location.href = `${config.appSubUrl}/playlists/play/${this.playlistId}`;
      window.location.href = this.startUrl;
    } else {
      var dash = this.dashboards[this.index];
      this.$location.url('dashboard/' + dash.uri);

@@ -39,6 +39,7 @@ class PlaylistSrv {
  start(playlistId) {
    this.stop();

    this.startUrl = window.location.href;
    this.index = 0;
    this.playlistId = playlistId;
    this.$rootScope.playlistSrv = this;
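The PlaylistSrv hunks above replace the hard-coded "play" URL with a remembered start URL: the service records where playback began and navigates back there once every dashboard has been shown. A minimal sketch of that bookkeeping, where `navigate` stands in for `window.location`/`$location` and all names are illustrative:

```javascript
// Sketch of the playlist start-URL bookkeeping from the hunks above.
function makePlaylist(dashboards, navigate) {
  var index = 0;
  var startUrl = null;
  return {
    start: function(currentUrl) {
      index = 0;
      startUrl = currentUrl; // remember where playback began
    },
    next: function() {
      if (index > dashboards.length - 1) {
        navigate(startUrl);  // played everything: return to the start page
        return;
      }
      navigate('dashboard/' + dashboards[index].uri);
      index += 1;
    },
  };
}
```

Calling `next()` past the last dashboard navigates back to whatever URL was active when `start()` was called, instead of a fixed playlist page.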
@@ -80,7 +80,6 @@ export class DataSourceEditCtrl {
    this.backendSrv.get('/api/datasources/' + id).then(ds => {
      this.isNew = false;
      this.current = ds;

      if (datasourceCreated) {
        datasourceCreated = false;
        this.testDatasource();
@@ -1,56 +1,69 @@

<div class="gf-form-group">
  <h3 class="page-heading">Http settings</h3>
<h3 class="page-heading">Http settings</h3>
<div class="gf-form-group">
  <div class="gf-form-inline">
    <div class="gf-form max-width-30">
      <span class="gf-form-label width-7">Url</span>
      <input class="gf-form-input" type="text"
        ng-model='current.url' placeholder="{{suggestUrl}}"
        bs-typeahead="getSuggestUrls" min-length="0"
        ng-pattern="/^(ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?$/" required></input>
      <info-popover mode="right-absolute">
        <p>Specify a complete HTTP url (for example http://your_server:8080)</p>
        <span ng-show="current.access === 'direct'">
          Your access method is <em>Direct</em>, this means the url
          needs to be accessible from the browser.
        </span>
        <span ng-show="current.access === 'proxy'">
          Your access method is currently <em>Proxy</em>, this means the url
          needs to be accessible from the grafana backend.
        </span>
      </info-popover>
    </div>
  </div>

  <div class="gf-form-inline">
    <div class="gf-form max-width-30">
      <span class="gf-form-label width-7">Access</span>
      <div class="gf-form-select-wrapper gf-form-select-wrapper--has-help-icon max-width-24">
        <select class="gf-form-input" ng-model="current.access" ng-options="f for f in ['direct', 'proxy']"></select>
        <info-popover mode="right-absolute">
          Direct = url is used directly from browser<br>
          Proxy = Grafana backend will proxy the request
        </info-popover>
      </div>
    </div>
  </div>
</div>

<h3 class="page-heading">Http Auth</h3>

  <div class="gf-form-inline">
    <div class="gf-form max-width-30">
      <span class="gf-form-label width-7">Url</span>
      <input class="gf-form-input" type="text"
        ng-model='current.url' placeholder="{{suggestUrl}}"
        bs-typeahead="getSuggestUrls" min-length="0"
        ng-pattern="/^(ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?$/" required></input>
      <info-popover mode="right-absolute">
        <p>Specify a complete HTTP url (for example http://your_server:8080)</p>
        <span ng-show="current.access === 'direct'">
          Your access method is <em>Direct</em>, this means the url
          needs to be accessible from the browser.
        </span>
        <span ng-show="current.access === 'proxy'">
          Your access method is currently <em>Proxy</em>, this means the url
          needs to be accessible from the grafana backend.
        </span>
      </info-popover>
    </div>
  </div>

  <div class="gf-form-inline">
    <div class="gf-form max-width-30">
      <span class="gf-form-label width-7">Access</span>
      <div class="gf-form-select-wrapper gf-form-select-wrapper--has-help-icon max-width-24">
        <select class="gf-form-input" ng-model="current.access" ng-options="f for f in ['direct', 'proxy']"></select>
        <info-popover mode="right-absolute">
          Direct = url is used directly from browser<br>
          Proxy = Grafana backend will proxy the request
        </info-popover>
      </div>
    </div>
  </div>

  <div class="gf-form-inline">
    <div class="gf-form">
      <label class="gf-form-label width-7">Http Auth</label>
    </div>
    <gf-form-switch class="gf-form"
      label="Basic Auth"
      checked="current.basicAuth" switch-class="max-width-6">
      checked="current.basicAuth" label-class="width-8" switch-class="max-width-6">
    </gf-form-switch>
    <gf-form-switch class="gf-form"
      label="With Credentials"
      checked="current.withCredentials" switch-class="max-width-6">
      label="With Credentials" tooltip="Whether credentials such as cookies or auth headers should be sent with cross-site requests."
      checked="current.withCredentials" label-class="width-11" switch-class="max-width-6">
    </gf-form-switch>
  </div>
  <div class="gf-form-inline">
    <gf-form-switch class="gf-form" ng-if="current.access=='proxy'"
      label="TLS Client Auth" label-class="width-8"
      checked="current.jsonData.tlsAuth" switch-class="max-width-6">
    </gf-form-switch>
    <gf-form-switch class="gf-form" ng-if="current.access=='proxy'"
      label="With CA Cert" tooltip="Optional. Needed for self-signed TLS Certs."
      checked="current.jsonData.tlsAuthWithCACert" label-class="width-11" switch-class="max-width-6">
    </gf-form-switch>
  </div>
</div>

<div class="gf-form-group" ng-if="current.basicAuth">
  <h6>Basic Auth Details</h6>
  <div class="gf-form" ng-if="current.basicAuth">
    <span class="gf-form-label width-7">
      User

@@ -58,7 +71,7 @@
    <input class="gf-form-input max-width-21" type="text" ng-model='current.basicAuthUser' placeholder="user" required></input>
  </div>

  <div class="gf-form" ng-if="current.basicAuth">
  <div class="gf-form">
    <span class="gf-form-label width-7">
      Password
    </span>

@@ -66,3 +79,51 @@
  </div>
</div>

<div class="gf-form-group" ng-if="current.jsonData.tlsAuth && current.access=='proxy'">
  <div class="gf-form">
    <h6>TLS Auth Details</h6>
    <info-popover mode="header">TLS Certs are encrypted and stored in the Grafana database.</info-popover>
  </div>
  <div ng-if="current.jsonData.tlsAuthWithCACert">
    <div class="gf-form-inline">
      <div class="gf-form gf-form--v-stretch">
        <label class="gf-form-label width-7">CA Cert</label>
      </div>
      <div class="gf-form gf-form--grow" ng-if="!current.tlsAuth.tlsCACertSet">
        <textarea rows="7" class="gf-form-input gf-form-textarea" ng-model="current.secureJsonData.tlsCACert" placeholder="Begins with -----BEGIN CERTIFICATE-----. The CA Certificate is necessary if you are using self-signed certificates."></textarea>
      </div>

      <div class="gf-form" ng-if="current.tlsAuth.tlsCACertSet">
        <input type="text" class="gf-form-input max-width-12" disabled="disabled" value="configured">
        <a class="btn btn-secondary gf-form-btn" href="#" ng-if="current.tlsAuth.tlsCACertSet" ng-click="current.tlsAuth.tlsCACertSet = false">reset</a>
      </div>
    </div>
  </div>

  <div class="gf-form-inline">
    <div class="gf-form gf-form--v-stretch">
      <label class="gf-form-label width-7">Client Cert</label>
    </div>
    <div class="gf-form gf-form--grow" ng-if="!current.tlsAuth.tlsClientCertSet">
      <textarea rows="7" class="gf-form-input gf-form-textarea" ng-model="current.secureJsonData.tlsClientCert" placeholder="Begins with -----BEGIN CERTIFICATE-----" required></textarea>
    </div>
    <div class="gf-form" ng-if="current.tlsAuth.tlsClientCertSet">
      <input type="text" class="gf-form-input max-width-12" disabled="disabled" value="configured">
      <a class="btn btn-secondary gf-form-btn" href="#" ng-if="current.tlsAuth.tlsClientCertSet" ng-click="current.tlsAuth.tlsClientCertSet = false">reset</a>
    </div>
  </div>

  <div class="gf-form-inline">
    <div class="gf-form gf-form--v-stretch">
      <label class="gf-form-label width-7">Client Key</label>
    </div>
    <div class="gf-form gf-form--grow" ng-if="!current.tlsAuth.tlsClientKeySet">
      <textarea rows="7" class="gf-form-input gf-form-textarea" ng-model="current.secureJsonData.tlsClientKey" placeholder="Begins with -----BEGIN RSA PRIVATE KEY-----" required></textarea>
    </div>
    <div class="gf-form" ng-if="current.tlsAuth.tlsClientKeySet">
      <input type="text" class="gf-form-input max-width-12" disabled="disabled" value="configured">
      <a class="btn btn-secondary gf-form-btn" href="#" ng-if="current.tlsAuth.tlsClientKeySet" ng-click="current.tlsAuth.tlsClientKeySet = false">reset</a>
    </div>
  </div>
</div>
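The `ng-pattern` in the datasource template above validates the HTTP settings URL. The same expression can be checked standalone in plain JavaScript (the regex is copied verbatim from the template; the variable name is illustrative):

```javascript
// The URL validation pattern used by the datasource Http settings form above.
var urlPattern = /^(ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?$/;

// It requires an explicit ftp/http/https scheme, then optional credentials,
// a host (any non-whitespace run), an optional port, and an optional path.
function isValidDatasourceUrl(url) {
  return urlPattern.test(url);
}
```

So `http://your_server:8080` (the example from the popover) passes, while a bare host without a scheme is rejected.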
@@ -25,6 +25,7 @@ function (angular, _, kbn) {
  this.updateTemplateData = function() {
    this._index = {};
    this._filters = {};
    this._adhocVariables = {};

    for (var i = 0; i < this.variables.length; i++) {
      var variable = this.variables[i];

@@ -111,10 +112,18 @@ function (angular, _, kbn) {
    this._grafanaVariables[name] = value;
  };

  this.variableExists = function(expression) {
  this.getVariableName = function(expression) {
    this._regex.lastIndex = 0;
    var match = this._regex.exec(expression);
    return match && (self._index[match[1] || match[2]] !== void 0);
    if (!match) {
      return null;
    }
    return match[1] || match[2];
  };

  this.variableExists = function(expression) {
    var name = this.getVariableName(expression);
    return name && (self._index[name] !== void 0);
  };

  this.highlightVariablesAsHtml = function(str) {
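The templateSrv hunk above splits variable-name extraction out of `variableExists` so both can share one regex with two capture groups. A standalone sketch of that extraction; the regex here is an illustrative stand-in for Grafana's `_regex` covering the `$var` and `[[var]]` syntaxes, not necessarily the exact committed expression:

```javascript
// Illustrative stand-in for templateSrv's variable regex: matches $var
// (group 1) or [[var]] (group 2).
var variableRegex = /\$(\w+)|\[\[([\s\S]+?)\]\]/g;

function getVariableName(expression) {
  variableRegex.lastIndex = 0; // reset, since the regex carries the /g flag
  var match = variableRegex.exec(expression);
  if (!match) {
    return null;
  }
  return match[1] || match[2]; // whichever capture group matched
}
```

Resetting `lastIndex` before each `exec` matters because a global regex otherwise resumes matching from where the previous call stopped.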
@@ -15,8 +15,8 @@
  </div>

  <div ng-show='dashboardMeta.canEdit' class="row-fluid add-row-panel-hint">
    <div class="span12" style="text-align:right;">
      <span style="margin-right: 10px;" ng-click="addRowDefault()" class="pointer btn btn-inverse btn-small">
    <div class="span12" style="text-align:left;">
      <span style="margin-left: 12px;" ng-click="addRowDefault()" class="pointer btn btn-inverse btn-small">
        <span><i class="fa fa-plus"></i> ADD ROW</span>
      </span>
    </div>
Some files were not shown because too many files have changed in this diff.