merge with master

This commit is contained in: commit cb9c2bcb2f

CHANGELOG.md (47 changed lines)

@@ -1,8 +1,51 @@
# 4.3.0-stable (unreleased)
# 4.4.0 (unreleased)

## New Features

**Dashboard History**: View dashboard version history, compare any two versions (summary & json diffs), restore to old version. This big feature
was contributed by **Walmart Labs**. Big thanks to them for this massive contribution!
Initial feature request: [#4638](https://github.com/grafana/grafana/issues/4638)
Pull Request: [#8472](https://github.com/grafana/grafana/pull/8472)

## Enhancements

* **Elasticsearch**: Added filter aggregation label [#8420](https://github.com/grafana/grafana/pull/8420), thx [@tianzk](https://github.com/tianzk)
* **Sensu**: Added option for source and handler [#8405](https://github.com/grafana/grafana/pull/8405), thx [@joemiller](https://github.com/joemiller)
* **CSV**: Configurable csv export datetime format [#8058](https://github.com/grafana/grafana/issues/8058), thx [@cederigo](https://github.com/cederigo)

# 4.3.2 (2017-05-31)

## Bug fixes

* **InfluxDB**: Fixed issue with query editor not showing ALIAS BY input field when in text editor mode [#8459](https://github.com/grafana/grafana/issues/8459)
* **Graph Log Scale**: Fixed issue with log scale going below x-axis [#8244](https://github.com/grafana/grafana/issues/8244)
* **Playlist**: Fixed dashboard play order issue [#7688](https://github.com/grafana/grafana/issues/7688)
* **Elasticsearch**: Fixed table query issue with ES 2.x [#8467](https://github.com/grafana/grafana/issues/8467), thx [@goldeelox](https://github.com/goldeelox)

## Changes

* **Lazy Loading Of Panels**: Panels are no longer loaded as they are scrolled into view; this was reverted due to a Chrome bug and might be reintroduced when Chrome fixes its JS blocking behavior on scroll. [#8500](https://github.com/grafana/grafana/issues/8500)

# 4.3.1 (2017-05-23)

## Bug fixes

* **S3 image upload**: Fixed image url issue for us-east-1 (us standard) region. If you were missing slack images for alert notifications this should fix it. [#8444](https://github.com/grafana/grafana/issues/8444)

# 4.3.0-stable (2017-05-23)

## Bug fixes

* **Gzip**: Fixed crash when gzip was enabled [#8380](https://github.com/grafana/grafana/issues/8380)
* **Graphite**: Fixed issue with Toggle edit mode in query editor [#8377](https://github.com/grafana/grafana/issues/8377)
* **Alerting**: Fixed issue with state history not showing query execution errors [#8412](https://github.com/grafana/grafana/issues/8412)
* **Alerting**: Fixed issue with missing state history events/annotations when using sqlite3 database [#7992](https://github.com/grafana/grafana/issues/7992)
* **Sqlite**: Fixed issue with database table locked when using sqlite3 database [#7992](https://github.com/grafana/grafana/issues/7992)
* **Alerting**: Fixed issue with annotations showing up in unsaved dashboards, new graph & alert panel. [#8361](https://github.com/grafana/grafana/issues/8361)
* **webdav**: Fixed http proxy env variable support for webdav image upload [#7922](https://github.com/grafana/grafana/issues/7922), thx [@berghauz](https://github.com/berghauz)
* **Prometheus**: Fixed issue with hiding query [#8413](https://github.com/grafana/grafana/issues/8413)

## Enhancements

* **VictorOps**: Now supports panel image & auto resolve [#8431](https://github.com/grafana/grafana/pull/8431), thx [@davidmscott](https://github.com/davidmscott)
* **Alerting**: Alert annotations now provide more info [#8421](https://github.com/grafana/grafana/pull/8421)

# 4.3.0-beta1 (2017-05-12)

@@ -18,7 +61,7 @@
* **Heatmap**: Heatmap Panel [#7934](https://github.com/grafana/grafana/pull/7934)
* **Elasticsearch**: histogram aggregation [#3164](https://github.com/grafana/grafana/issues/3164)

## Minor Enchancements
## Minor Enhancements

* **InfluxDB**: Small fix for the "glow" when focus the field for LIMIT and SLIMIT [#7799](https://github.com/grafana/grafana/pull/7799) thx [@thuck](https://github.com/thuck)
* **Prometheus**: Make Prometheus query field a textarea [#7663](https://github.com/grafana/grafana/issues/7663), thx [@hagen1778](https://github.com/hagen1778)
@@ -1,4 +1,4 @@
Copyright 2014-2016 Torkel Ödegaard, Raintank Inc.
Copyright 2014-2017 Torkel Ödegaard, Raintank Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you
may not use this file except in compliance with the License. You may
@@ -146,8 +146,7 @@ Create a custom.ini in the conf directory to override default configuration opti
You only need to add the options you want to override. Config files are applied in the order of:

1. grafana.ini
2. dev.ini (if found)
3. custom.ini
1. custom.ini
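For illustration, a minimal `custom.ini` could override just a couple of options (the option names below are real Grafana settings, but the values are made-up examples):

```ini
# conf/custom.ini - only the overridden options need to be listed (example values)
[server]
http_port = 8080
domain = grafana.example.com
```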
## Create a pull request
Before or after you create a pull request, sign the [contributor license agreement](http://docs.grafana.org/project/cla/).
build.go (2 changed lines)

@@ -235,7 +235,7 @@ func createRpmPackages() {
defaultFileSrc: "packaging/rpm/sysconfig/grafana-server",
systemdFileSrc: "packaging/rpm/systemd/grafana-server.service",

depends: []string{"/sbin/service", "fontconfig"},
depends: []string{"/sbin/service", "fontconfig", "freetype", "urw-fonts"},
})
}
@ -249,6 +249,7 @@ allowed_domains =
|
||||
hosted_domain =
|
||||
|
||||
#################################### Grafana.com Auth ####################
|
||||
# legacy key names (so they work in env variables)
|
||||
[auth.grafananet]
|
||||
enabled = false
|
||||
allow_sign_up = true
|
||||
@ -257,6 +258,14 @@ client_secret = some_secret
|
||||
scopes = user:email
|
||||
allowed_organizations =
|
||||
|
||||
[auth.grafana_com]
|
||||
enabled = false
|
||||
allow_sign_up = true
|
||||
client_id = some_id
|
||||
client_secret = some_secret
|
||||
scopes = user:email
|
||||
allowed_organizations =
|
||||
|
||||
#################################### Generic OAuth #######################
|
||||
[auth.generic_oauth]
|
||||
name = OAuth
|
||||
@ -433,6 +442,9 @@ prefix = prod.grafana.%(instance_name)s.
|
||||
[grafana_net]
|
||||
url = https://grafana.com
|
||||
|
||||
[grafana_com]
|
||||
url = https://grafana.com
|
||||
|
||||
#################################### External Image Storage ##############
|
||||
[external_image_storage]
|
||||
# You can choose between (s3, webdav)
|
||||
|
@ -249,7 +249,7 @@
|
||||
;allowed_organizations =
|
||||
|
||||
#################################### Grafana.com Auth ####################
|
||||
[auth.grafananet]
|
||||
[auth.grafana_com]
|
||||
;enabled = false
|
||||
;allow_sign_up = true
|
||||
;client_id = some_id
|
||||
@ -298,7 +298,7 @@
|
||||
# Use space to separate multiple modes, e.g. "console file"
|
||||
;mode = console file
|
||||
|
||||
# Either "trace", "debug", "info", "warn", "error", "critical", default is "info"
|
||||
# Either "debug", "info", "warn", "error", "critical", default is "info"
|
||||
;level = info
|
||||
|
||||
# optional settings to set different levels for specific loggers. Ex filters = sqlstore:debug
|
||||
@ -386,7 +386,7 @@
|
||||
|
||||
#################################### Grafana.com integration ##########################
|
||||
# Url used to import dashboards directly from Grafana.com
|
||||
[grafana_net]
|
||||
[grafana_com]
|
||||
;url = https://grafana.com
|
||||
|
||||
#################################### External image storage ##########################
|
||||
|
@@ -4,7 +4,6 @@ graphite:
    - "8080:80"
    - "2003:2003"
  volumes:
    - /var/docker/gfdev/graphite:/opt/graphite/storage/whisper
    - /etc/localtime:/etc/localtime:ro
    - /etc/timezone:/etc/timezone:ro
@@ -1,8 +1,7 @@
# Building The Docs

To build the docs locally, you need to have docker installed. The
docs are built using a custom [docker](https://www.docker.com/) image
and the [mkdocs](http://www.mkdocs.org/) tool.
docs are built using [Hugo](http://gohugo.io/) - a static site generator.

**Prepare the Docker Image**:

@@ -11,19 +10,40 @@ when running ``make docs-build`` depending on how your system's docker
service is configured):

```
$ git clone https://github.com/grafana/grafana.org
$ cd grafana.org
$ make docs-build
git clone https://github.com/grafana/grafana.org
cd grafana.org
make docs-build
```

**Build the Documentation**:

Now that the docker image has been prepared we can build the
docs. Switch your working directory back to the directory this file
(README.md) is in and run (possibly with ``sudo``):
grafana docs and start a docs server.

If you have not cloned the Grafana repository already then:

```
$ make docs
cd ..
git clone https://github.com/grafana/grafana
```

Switch your working directory to the directory this file
(README.md) is in.

```
cd grafana/docs
```

An AWS config file is required to build the docs Docker image and to publish the site to AWS. If you are building locally only and do not have any AWS credentials for docs.grafana.org then create an empty file named `awsconfig` in the current directory.

```
touch awsconfig
```

Then run (possibly with ``sudo``):

```
make watch
```

This command will not return control of the shell to the user. Instead

@@ -32,4 +52,21 @@ we created in the previous step.

Open [localhost:3004](http://localhost:3004) to view the docs.

### Images & Content

All markdown files are located in this repo (main grafana repo). But all images are added to the https://github.com/grafana/grafana.org repo. So the process of adding images is a bit complicated.

First you need to create a feature (PR) branch of https://github.com/grafana/grafana.org so you can make changes. Then add the image to the `/static/img/docs` directory. Then make a commit that adds the image.

Then run:
```
make docs-build
```

This will rebuild the docs docker container.

To be able to use the image you have to quit (CTRL-C) the `make watch` command (that you run in the same directory as this README). Then simply rerun `make watch`; it will restart the docs server but now with access to your image.

### Editing content

Changes to the markdown files should automatically cause a docs rebuild and live reload should reload the page in your browser.
@@ -52,12 +52,22 @@ Here you can specify the name of the alert rule and how often the scheduler shou

### Conditions

Currently the only condition type that exists is a `Query` condition that allows you to
specify a query letter, time range and an aggregation function. The letter refers to
a query you already have added in the **Metrics** tab. The result from the query and the aggregation function is
a single value that is then used in the threshold check. The query used in an alert rule cannot
contain any template variables. Currently we only support `AND` and `OR` operators between conditions and they are executed serially.
specify a query letter, time range and an aggregation function.

### Query condition example

```sql
avg() OF query(A, 5m, now) IS BELOW 14
```

- `avg()` Controls how the values for **each** series should be reduced to a value that can be compared against the threshold. Click on the function to change it to another aggregation function.
- `query(A, 5m, now)` The letter defines what query to execute from the **Metrics** tab. The second two parameters define the time range, `5m, now` means from 5 minutes ago to now. You can also do `10m, now-2m` to define a time range that will be from 10 minutes ago to 2 minutes ago. This is useful if you want to ignore the last 2 minutes of data.
- `IS BELOW 14` Defines the type of threshold and the threshold value. You can click on `IS BELOW` to change the type of threshold.

The query used in an alert rule cannot contain any template variables. Currently we only support `AND` and `OR` operators between conditions and they are executed serially.
For example, we have 3 conditions in the following order:
`condition:A(evaluates to: TRUE) OR condition:B(evaluates to: FALSE) AND condition:C(evaluates to: TRUE)`
*condition:A(evaluates to: TRUE) OR condition:B(evaluates to: FALSE) AND condition:C(evaluates to: TRUE)*
so the result will be calculated as ((TRUE OR FALSE) AND TRUE) = TRUE.

We plan to add other condition types in the future, like `Other Alert`, where you can include the state
@@ -92,9 +92,10 @@ The Elasticsearch data source supports two types of queries you can use in the *
Query | Description
------------ | -------------
*{"find": "fields", "type": "keyword"}* | Returns a list of field names with the index type `keyword`.
*{"find": "terms", "field": "@hostname"}* | Returns a list of values for a field using term aggregation. Query will use the current dashboard time range as the time range for the query.
*{"find": "terms", "field": "@hostname", "size": 1000}* | Returns a list of values for a field using term aggregation. Query will use the current dashboard time range as the time range for the query.
*{"find": "terms", "field": "@hostname", "query": '<lucene query>'}* | Returns a list of values for a field using term aggregation and a specified lucene query filter. Query will use the current dashboard time range as the time range for the query.

There is a default size limit of 500 on terms queries. Set the size property in your query to set a custom limit.
You can use other variables inside the query. Example query definition for a variable named `$host`.

```
@ -237,12 +237,14 @@ Change password for specific user
|
||||
Accept: application/json
|
||||
Content-Type: application/json
|
||||
|
||||
{"password":"userpassword"}
|
||||
|
||||
**Example Response**:
|
||||
|
||||
HTTP/1.1 200
|
||||
Content-Type: application/json
|
||||
|
||||
{"password":"userpassword"}
|
||||
{"message": "User password updated"}
|
||||
|
||||
## Permissions
|
||||
|
||||
@ -254,6 +256,8 @@ Change password for specific user
|
||||
Accept: application/json
|
||||
Content-Type: application/json
|
||||
|
||||
{"isGrafanaAdmin": true}
|
||||
|
||||
**Example Response**:
|
||||
|
||||
HTTP/1.1 200
|
||||
|
@@ -52,6 +52,15 @@ parent = "http_api"
  "expires": 3600
}

JSON Body schema:

- **dashboard** - Required. The complete dashboard model.
- **name** - Optional. Snapshot name.
- **expires** - Optional. When the snapshot should expire in seconds. 3600 is 1 hour, 86400 is 1 day. Default is never to expire.
- **external** - Optional. Save the snapshot on an external server rather than locally. Default is `false`.
- **key** - Optional. Define the unique key. Required if **external** is `true`.
- **deleteKey** - Optional. Unique key used to delete the snapshot. It is different from the **key** so that only the creator can delete the snapshot. Required if **external** is `true`.
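Putting the schema together, a hypothetical request body for an external snapshot might look like this (the dashboard model is trimmed and the key values are made up):

```json
{
  "dashboard": { "title": "Production Overview", "editable": false, "rows": [] },
  "name": "Production Overview snapshot",
  "expires": 86400,
  "external": true,
  "key": "a-unique-key",
  "deleteKey": "a-different-delete-key"
}
```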
**Example Response**:

HTTP/1.1 200
@@ -444,20 +444,29 @@ false only pre-existing Grafana users will be able to login (if ldap authentica

<hr>

## [auth.proxy]

This feature allows you to handle authentication in an HTTP reverse proxy.

### enabled

Defaults to `false`

### header_name

Defaults to X-WEBAUTH-USER

### header_property

Defaults to username but can also be set to email

### auto_sign_up

Set to `true` to enable auto sign up of users who do not exist in Grafana DB. Defaults to `true`.

### whitelist

Limit where auth proxy requests come from by configuring a list of IP addresses. This can be used to prevent users spoofing the X-WEBAUTH-USER header.
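As a rough sketch, the options described above map onto an ini block like the following (the header and IP values are example choices, not defaults):

```ini
[auth.proxy]
enabled = true
header_name = X-WEBAUTH-USER
header_property = username
auto_sign_up = true
whitelist = 192.168.1.1, 192.168.2.1
```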
<hr>

## [session]
@ -15,8 +15,7 @@ weight = 1
|
||||
|
||||
Description | Download
|
||||
------------ | -------------
|
||||
Stable for Debian-based Linux | [4.2.0 (x86-64 deb)](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.2.0_amd64.deb)
|
||||
Beta for Debian-based Linux | [4.3.0-beta1 (x86-64 deb)](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.3.0-beta1_amd64.deb)
|
||||
Stable for Debian-based Linux | [grafana_4.3.1_amd64.deb](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.3.1_amd64.deb)
|
||||
|
||||
Read [Upgrading Grafana]({{< relref "installation/upgrading.md" >}}) for tips and guidance on updating an existing
|
||||
installation.
|
||||
@ -24,11 +23,12 @@ installation.
|
||||
## Install Stable
|
||||
|
||||
```bash
|
||||
wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.2.0_amd64.deb
|
||||
wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.3.1_amd64.deb
|
||||
sudo apt-get install -y adduser libfontconfig
|
||||
sudo dpkg -i grafana_4.2.0_amd64.deb
|
||||
sudo dpkg -i grafana_4.3.1_amd64.deb
|
||||
```
|
||||
|
||||
<!--
|
||||
## Install Beta
|
||||
|
||||
```bash
|
||||
@ -36,6 +36,7 @@ wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_4.3.0-b
|
||||
sudo apt-get install -y adduser libfontconfig
|
||||
sudo dpkg -i grafana_4.3.0-beta1_amd64.deb
|
||||
```
|
||||
-->
|
||||
|
||||
## APT Repository
|
||||
|
||||
|
@ -15,8 +15,7 @@ weight = 2
|
||||
|
||||
Description | Download
|
||||
------------ | -------------
|
||||
Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [4.2.0 (x86-64 rpm)](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.2.0-1.x86_64.rpm)
|
||||
Beta for CentOS / Fedora / OpenSuse / Redhat Linux | [4.3.0-beta1 (x86-64 rpm)](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.0-beta1.x86_64.rpm)
|
||||
Stable for CentOS / Fedora / OpenSuse / Redhat Linux | [4.3.1 (x86-64 rpm)](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.1-1.x86_64.rpm)
|
||||
|
||||
Read [Upgrading Grafana]({{< relref "installation/upgrading.md" >}}) for tips and guidance on updating an existing
|
||||
installation.
|
||||
@ -25,19 +24,19 @@ installation.
|
||||
|
||||
You can install Grafana using Yum directly.
|
||||
|
||||
$ sudo yum install https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.2.0-1.x86_64.rpm
|
||||
$ sudo yum install https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.1-1.x86_64.rpm
|
||||
|
||||
Or install manually using `rpm`.
|
||||
|
||||
#### On CentOS / Fedora / Redhat:
|
||||
|
||||
$ wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.2.0-1.x86_64.rpm
|
||||
$ wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.1-1.x86_64.rpm
|
||||
$ sudo yum install initscripts fontconfig
|
||||
$ sudo rpm -Uvh grafana-4.2.0-1.x86_64.rpm
|
||||
$ sudo rpm -Uvh grafana-4.3.1-1.x86_64.rpm
|
||||
|
||||
#### On OpenSuse:
|
||||
|
||||
$ sudo rpm -i --nodeps grafana-4.2.0-1.x86_64.rpm
|
||||
$ sudo rpm -i --nodeps grafana-4.3.1-1.x86_64.rpm
|
||||
|
||||
## Install via YUM Repository
|
||||
|
||||
|
@ -13,8 +13,7 @@ weight = 3
|
||||
|
||||
Description | Download
|
||||
------------ | -------------
|
||||
Latest stable package for Windows | [grafana.4.2.0.windows-x64.zip](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.2.0.windows-x64.zip)
|
||||
Beta package for Windows | [grafana-4.3.0-beta1.windows-x64.zip](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.0-beta1.windows-x64.zip)
|
||||
Latest stable package for Windows | [grafana.4.3.1.windows-x64.zip](https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana-4.3.1.windows-x64.zip)
|
||||
|
||||
Read [Upgrading Grafana]({{< relref "installation/upgrading.md" >}}) for tips and guidance on updating an existing
|
||||
installation.
|
||||
|
@@ -99,6 +99,6 @@ To manually install a Plugin via the Grafana.com API:
}
```

4. Download the plugin with `https://grafana.com/api/plugins/<plugin id from step 1>/versions/<current version>/download` (for example: https://grafana.com/api/plugins/jdbranham-diagram-panel/versions/1.4.0/download). Unzip the downloaded file into the Grafana Server's `data/plugins` directory.
4. Download the plugin with `https://grafana.com/api/plugins/<plugin id from step 1>/versions/<current version>/download` (for example: https://grafana.com/api/plugins/jdbranham-diagram-panel/versions/1.4.0/download). Unzip the downloaded file into the Grafana Server's `plugins` directory.
5. Restart the Grafana Server.
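As a rough sketch, step 4 can be scripted like this; the plugin id and version come from the example URL above, and the plugins directory path is an assumption that depends on how Grafana was installed:

```bash
# download the plugin archive from Grafana.com and unpack it into the plugins directory
wget -O diagram-panel.zip \
  "https://grafana.com/api/plugins/jdbranham-diagram-panel/versions/1.4.0/download"
unzip diagram-panel.zip -d /var/lib/grafana/plugins
```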
@@ -65,7 +65,7 @@ Each field in the dashboard JSON is explained below with its usage:
| **timezone** | timezone of dashboard, i.e. `utc` or `browser` |
| **editable** | whether a dashboard is editable or not |
| **hideControls** | whether row controls on the left in green are hidden or not |
| **graphTooltip** | TODO |
| **graphTooltip** | 0 for no shared crosshair or tooltip (default), 1 for shared crosshair, 2 for shared crosshair AND shared tooltip |
| **rows** | row metadata, see [rows section](#rows) for details |
| **time** | time range for dashboard, i.e. last 6 hours, last 7 days, etc |
| **timepicker** | timepicker metadata, see [timepicker section](#timepicker) for details |
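For instance, enabling a shared crosshair is just a matter of setting that field in the dashboard JSON (a trimmed, hypothetical fragment):

```json
{
  "title": "Production Overview",
  "graphTooltip": 1,
  "rows": []
}
```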
docs/sources/tutorials/api_org_token_howto.md (new file, 74 lines)

@@ -0,0 +1,74 @@
+++
title = "API Tutorial: How To Create API Tokens And Dashboards For A Specific Organization"
type = "docs"
keywords = ["grafana", "tutorials", "API", "Token", "Org", "Organization"]
[menu.docs]
parent = "tutorials"
weight = 10
+++

# API Tutorial: How To Create API Tokens And Dashboards For A Specific Organization

A common scenario is to use the Grafana API to set up new Grafana organizations or to add dynamically generated dashboards to an existing organization.

## Authentication

There are two ways to authenticate against the API: basic authentication and API Tokens.

Some parts of the API are only available through basic authentication and these parts of the API usually require that the user is a Grafana Admin. But all organization actions are accessed via an API Token. An API Token is tied to an organization and can be used to create dashboards etc. but only for that organization.
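As a quick illustration of the two styles (the endpoints are arbitrary examples and the token is a placeholder, not a real key):

```bash
# basic authentication - admin-level endpoints require a Grafana Admin user
curl http://admin:admin@localhost:3000/api/orgs

# API token - scoped to the organization the token was created in
curl -H "Authorization: Bearer <your api token>" http://localhost:3000/api/dashboards/home
```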
|
||||
## How To Create A New Organization and an API Token
|
||||
|
||||
The task is to create a new organization and then add a Token that can be used by other users. In the examples below which use basic auth, the user is `admin` and the password is `admin`.
|
||||
|
||||
1. [Create the org](http://docs.grafana.org/http_api/org/#create-organisation). Here is an example using curl:
|
||||
```
|
||||
curl -X POST -H "Content-Type: application/json" -d '{"name":"apiorg"}' http://admin:admin@localhost:3000/api/orgs
|
||||
```
|
||||
|
||||
This should return a response: `{"message":"Organization created","orgId":6}`. Use the orgId for the next steps.
|
||||
|
||||
2. Optional step. If the org was created previously and/or step 3 fails then first [add your Admin user to the org](http://docs.grafana.org/http_api/org/#add-user-in-organisation):
|
||||
```
|
||||
curl -X POST -H "Content-Type: application/json" -d '{"loginOrEmail":"admin", "role": "Admin"}' http://admin:admin@localhost:3000/api/orgs/<org id of new org>/users
|
||||
```
|
||||
|
||||
3. [Switch the org context for the Admin user to the new org](http://docs.grafana.org/http_api/user/#switch-user-context):
|
||||
```
|
||||
curl -X POST http://admin:admin@localhost:3000/api/user/using/<id of new org>
|
||||
```
|
||||
|
||||
4. [Create the API token](http://docs.grafana.org/http_api/auth/#create-api-key):
|
||||
```
|
||||
curl -X POST -H "Content-Type: application/json" -d '{"name":"apikeycurl", "role": "Admin"}' http://admin:admin@localhost:3000/api/auth/keys
|
||||
```
|
||||
|
||||
This should return a response: `{"name":"apikeycurl","key":"eyJrIjoiR0ZXZmt1UFc0OEpIOGN5RWdUalBJTllUTk83VlhtVGwiLCJuIjoiYXBpa2V5Y3VybCIsImlkIjo2fQ=="}`.
|
||||
|
||||
Save the key returned here in your password manager as it is not possible to fetch it again in the future.
|
||||
|
||||
## How To Add A Dashboard
|
||||
|
||||
Using the Token that was created in the previous step, you can create a dashboard or carry out other actions without having to switch organizations.
|
||||
|
||||
1. [Add a dashboard](http://docs.grafana.org/http_api/dashboard/#create-update-dashboard) using the key (or bearer token as it is also called):
|
||||
|
||||
```
|
||||
curl -X POST --insecure -H "Authorization: Bearer eyJrIjoiR0ZXZmt1UFc0OEpIOGN5RWdUalBJTllUTk83VlhtVGwiLCJuIjoiYXBpa2V5Y3VybCIsImlkIjo2fQ==" -H "Content-Type: application/json" -d '{
|
||||
"dashboard": {
|
||||
"id": null,
|
||||
"title": "Production Overview",
|
||||
"tags": [ "templated" ],
|
||||
"timezone": "browser",
|
||||
"rows": [
|
||||
{
|
||||
}
|
||||
],
|
||||
"schemaVersion": 6,
|
||||
"version": 0
|
||||
},
|
||||
"overwrite": false
|
||||
}' http://localhost:3000/api/dashboards/db
|
||||
```
|
||||
|
||||
This import will not work if you exported the dashboard via the Share -> Export menu in the Grafana UI (it strips out data source names etc.). View the JSON and save it to a file instead or fetch the dashboard JSON via the API.
|
@ -28,6 +28,29 @@
|
||||
</tr>
|
||||
</table>
|
||||
|
||||
[[if ne .Error "" ]]
|
||||
<table class="row" >
|
||||
<tr>
|
||||
<td class="last">
|
||||
<center>
|
||||
<table class="twelve columns" >
|
||||
<tr>
|
||||
<td class="twelve last">
|
||||
<h5 style="font-weight: bold;">Error message</h5>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td class="twelve last">
|
||||
<p>[[.Error]]</p>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
</center>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
[[end]]
|
||||
|
||||
[[if ne .State "ok" ]]
|
||||
<table class="row" >
|
||||
<tr>
|
||||
|
@ -4,7 +4,7 @@
|
||||
"company": "Coding Instinct AB"
|
||||
},
|
||||
"name": "grafana",
|
||||
"version": "4.3.0-beta1",
|
||||
"version": "4.4.0-pre1",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "http://github.com/grafana/grafana.git"
|
||||
|
@ -1,5 +1,5 @@
|
||||
#! /usr/bin/env bash
|
||||
version=4.2.0
|
||||
version=4.3.1
|
||||
|
||||
wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_${version}_amd64.deb
|
||||
|
||||
|
@ -41,6 +41,7 @@ func GetAnnotations(c *middleware.Context) Response {
|
||||
Title: item.Title,
|
||||
PanelId: item.PanelId,
|
||||
RegionId: item.RegionId,
|
||||
Type: string(item.Type),
|
||||
})
|
||||
}
|
||||
|
||||
|
@ -223,6 +223,13 @@ func (hs *HttpServer) registerRoutes() {
|
||||
// Dashboard
|
||||
r.Group("/dashboards", func() {
|
||||
r.Combo("/db/:slug").Get(GetDashboard).Delete(DeleteDashboard)
|
||||
|
||||
r.Get("/id/:dashboardId/versions", wrap(GetDashboardVersions))
|
||||
r.Get("/id/:dashboardId/versions/:id", wrap(GetDashboardVersion))
|
||||
r.Post("/id/:dashboardId/restore", reqEditorRole, bind(dtos.RestoreDashboardVersionCommand{}), wrap(RestoreDashboardVersion))
|
||||
|
||||
r.Post("/calculate-diff", bind(dtos.CalculateDiffOptions{}), wrap(CalculateDashboardDiff))
|
||||
|
||||
r.Post("/db", reqEditorRole, bind(m.SaveDashboardCommand{}), wrap(PostDashboard))
|
||||
r.Get("/file/:file", GetDashboardFromJsonFile)
|
||||
r.Get("/home", wrap(GetHomeDashboard))
|
||||
@ -253,6 +260,7 @@ func (hs *HttpServer) registerRoutes() {
|
||||
r.Post("/tsdb/query", bind(dtos.MetricRequest{}), wrap(QueryMetrics))
|
||||
r.Get("/tsdb/testdata/scenarios", wrap(GetTestDataScenarios))
|
||||
r.Get("/tsdb/testdata/gensql", reqGrafanaAdmin, wrap(GenerateSqlTestData))
|
||||
r.Get("/tsdb/testdata/random-walk", wrap(GetTestDataRandomWalk))
|
||||
|
||||
// metrics
|
||||
r.Get("/metrics", wrap(GetInternalMetrics))
|
||||
|
@ -30,12 +30,13 @@ var customMetricsDimensionsMap map[string]map[string]map[string]*CustomMetricsCa
|
||||
func init() {
|
||||
metricsMap = map[string][]string{
|
||||
"AWS/ApiGateway": {"4XXError", "5XXError", "CacheHitCount", "CacheMissCount", "Count", "IntegrationLatency", "Latency"},
|
||||
"AWS/ApplicationELB": {"ActiveConnectionCount", "ClientTLSNegotiationErrorCount", "HealthyHostCount", "HTTPCode_ELB_4XX_Count", "HTTPCode_ELB_5XX_Count", "HTTPCode_Target_2XX_Count", "HTTPCode_Target_3XX_Count", "HTTPCode_Target_4XX_Count", "HTTPCode_Target_5XX_Count", "NewConnectionCount", "ProcessedBytes", "RejectedConnectionCount", "RequestCount", "TargetConnectionErrorCount", "TargetResponseTime", "TargetTLSNegotiationErrorCount", "UnHealthyHostCount"},
|
||||
"AWS/ApplicationELB": {"ActiveConnectionCount", "ClientTLSNegotiationErrorCount", "HealthyHostCount", "HTTPCode_ELB_4XX_Count", "HTTPCode_ELB_5XX_Count", "HTTPCode_Target_2XX_Count", "HTTPCode_Target_3XX_Count", "HTTPCode_Target_4XX_Count", "HTTPCode_Target_5XX_Count", "IPv6ProcessedBytes", "IPv6RequestCount", "NewConnectionCount", "ProcessedBytes", "RejectedConnectionCount", "RequestCount", "TargetConnectionErrorCount", "TargetResponseTime", "TargetTLSNegotiationErrorCount", "UnHealthyHostCount"},
|
||||
"AWS/AutoScaling": {"GroupMinSize", "GroupMaxSize", "GroupDesiredCapacity", "GroupInServiceInstances", "GroupPendingInstances", "GroupStandbyInstances", "GroupTerminatingInstances", "GroupTotalInstances"},
|
||||
"AWS/Billing": {"EstimatedCharges"},
|
||||
"AWS/CloudFront": {"Requests", "BytesDownloaded", "BytesUploaded", "TotalErrorRate", "4xxErrorRate", "5xxErrorRate"},
|
||||
"AWS/CloudSearch": {"SuccessfulRequests", "SearchableDocuments", "IndexUtilization", "Partitions"},
|
||||
"AWS/DynamoDB": {"ConditionalCheckFailedRequests", "ConsumedReadCapacityUnits", "ConsumedWriteCapacityUnits", "OnlineIndexConsumedWriteCapacity", "OnlineIndexPercentageProgress", "OnlineIndexThrottleEvents", "ProvisionedReadCapacityUnits", "ProvisionedWriteCapacityUnits", "ReadThrottleEvents", "ReturnedBytes", "ReturnedItemCount", "ReturnedRecordsCount", "SuccessfulRequestLatency", "SystemErrors", "ThrottledRequests", "UserErrors", "WriteThrottleEvents"},
|
||||
"AWS/DMS": {"FreeableMemory", "WriteIOPS", "ReadIOPS", "WriteThroughput", "ReadThroughput", "WriteLatency", "ReadLatency", "SwapUsage", "NetworkTransmitThroughput", "NetworkReceiveThroughput", "FullLoadThroughputBandwidthSource", "FullLoadThroughputBandwidthTarget", "FullLoadThroughputRowsSource", "FullLoadThroughputRowsTarget", "CDCIncomingChanges", "CDCChangesMemorySource", "CDCChangesMemoryTarget", "CDCChangesDiskSource", "CDCChangesDiskTarget", "CDCThroughputBandwidthTarget", "CDCThroughputRowsSource", "CDCThroughputRowsTarget", "CDCLatencySource", "CDCLatencyTarget"},
|
||||
"AWS/DynamoDB": {"ConditionalCheckFailedRequests", "ConsumedReadCapacityUnits", "ConsumedWriteCapacityUnits", "OnlineIndexConsumedWriteCapacity", "OnlineIndexPercentageProgress", "OnlineIndexThrottleEvents", "ProvisionedReadCapacityUnits", "ProvisionedWriteCapacityUnits", "ReadThrottleEvents", "ReturnedBytes", "ReturnedItemCount", "ReturnedRecordsCount", "SuccessfulRequestLatency", "SystemErrors", "TimeToLiveDeletedItemCount", "ThrottledRequests", "UserErrors", "WriteThrottleEvents"},
|
||||
"AWS/EBS": {"VolumeReadBytes", "VolumeWriteBytes", "VolumeReadOps", "VolumeWriteOps", "VolumeTotalReadTime", "VolumeTotalWriteTime", "VolumeIdleTime", "VolumeQueueLength", "VolumeThroughputPercentage", "VolumeConsumedReadWriteOps", "BurstBalance"},
|
||||
"AWS/EC2": {"CPUCreditUsage", "CPUCreditBalance", "CPUUtilization", "DiskReadOps", "DiskWriteOps", "DiskReadBytes", "DiskWriteBytes", "NetworkIn", "NetworkOut", "NetworkPacketsIn", "NetworkPacketsOut", "StatusCheckFailed", "StatusCheckFailed_Instance", "StatusCheckFailed_System"},
|
||||
"AWS/EC2Spot": {"AvailableInstancePoolsCount", "BidsSubmittedForCapacity", "EligibleInstancePoolCount", "FulfilledCapacity", "MaxPercentCapacityAllocation", "PendingCapacity", "PercentCapacityAllocation", "TargetCapacity", "TerminatingCapacity"},
|
||||
@ -68,27 +69,28 @@ func init() {
|
||||
"CoreNodesRunning", "CoreNodesPending", "LiveDataNodes", "MRTotalNodes", "MRActiveNodes", "MRLostNodes", "MRUnhealthyNodes", "MRDecommissionedNodes", "MRRebootedNodes",
|
||||
"S3BytesWritten", "S3BytesRead", "HDFSUtilization", "HDFSBytesRead", "HDFSBytesWritten", "MissingBlocks", "CorruptBlocks", "TotalLoad", "MemoryTotalMB", "MemoryReservedMB", "MemoryAvailableMB", "MemoryAllocatedMB", "PendingDeletionBlocks", "UnderReplicatedBlocks", "DfsPendingReplicationBlocks", "CapacityRemainingGB",
|
||||
"HbaseBackupFailed", "MostRecentBackupDuration", "TimeSinceLastSuccessfulBackup"},
|
||||
"AWS/ES": {"ClusterStatus.green", "ClusterStatus.yellow", "ClusterStatus.red", "Nodes", "SearchableDocuments", "DeletedDocuments", "CPUUtilization", "FreeStorageSpace", "JVMMemoryPressure", "AutomatedSnapshotFailure", "MasterCPUUtilization", "MasterFreeStorageSpace", "MasterJVMMemoryPressure", "ReadLatency", "WriteLatency", "ReadThroughput", "WriteThroughput", "DiskQueueLength", "ReadIOPS", "WriteIOPS"},
|
||||
"AWS/ES": {"ClusterStatus.green", "ClusterStatus.yellow", "ClusterStatus.red", "ClusterUsedSpace", "Nodes", "SearchableDocuments", "DeletedDocuments", "CPUCreditBalance", "CPUUtilization", "FreeStorageSpace", "JVMMemoryPressure", "AutomatedSnapshotFailure", "MasterCPUCreditBalance", "MasterCPUUtilization", "MasterFreeStorageSpace", "MasterJVMMemoryPressure", "ReadLatency", "WriteLatency", "ReadThroughput", "WriteThroughput", "DiskQueueDepth", "ReadIOPS", "WriteIOPS"},
|
||||
"AWS/Events": {"Invocations", "FailedInvocations", "TriggeredRules", "MatchedEvents", "ThrottledRules"},
|
||||
"AWS/Firehose": {"DeliveryToElasticsearch.Bytes", "DeliveryToElasticsearch.Records", "DeliveryToElasticsearch.Success", "DeliveryToRedshift.Bytes", "DeliveryToRedshift.Records", "DeliveryToRedshift.Success", "DeliveryToS3.Bytes", "DeliveryToS3.DataFreshness", "DeliveryToS3.Records", "DeliveryToS3.Success", "IncomingBytes", "IncomingRecords", "DescribeDeliveryStream.Latency", "DescribeDeliveryStream.Requests", "ListDeliveryStreams.Latency", "ListDeliveryStreams.Requests", "PutRecord.Bytes", "PutRecord.Latency", "PutRecord.Requests", "PutRecordBatch.Bytes", "PutRecordBatch.Latency", "PutRecordBatch.Records", "PutRecordBatch.Requests", "UpdateDeliveryStream.Latency", "UpdateDeliveryStream.Requests"},
|
||||
"AWS/IoT": {"PublishIn.Success", "PublishOut.Success", "Subscribe.Success", "Ping.Success", "Connect.Success", "GetThingShadow.Accepted"},
|
||||
"AWS/Kinesis": {"GetRecords.Bytes", "GetRecords.IteratorAge", "GetRecords.IteratorAgeMilliseconds", "GetRecords.Latency", "GetRecords.Records", "GetRecords.Success", "IncomingBytes", "IncomingRecords", "PutRecord.Bytes", "PutRecord.Latency", "PutRecord.Success", "PutRecords.Bytes", "PutRecords.Latency", "PutRecords.Records", "PutRecords.Success", "ReadProvisionedThroughputExceeded", "WriteProvisionedThroughputExceeded", "IteratorAgeMilliseconds", "OutgoingBytes", "OutgoingRecords"},
|
||||
"AWS/KinesisAnalytics": {"Bytes", "MillisBehindLatest", "Records", "Success"},
|
||||
"AWS/Lambda": {"Invocations", "Errors", "Duration", "Throttles"},
|
||||
"AWS/Lambda": {"Invocations", "Errors", "Duration", "Throttles", "IteratorAge"},
|
||||
"AWS/Logs": {"IncomingBytes", "IncomingLogEvents", "ForwardedBytes", "ForwardedLogEvents", "DeliveryErrors", "DeliveryThrottling"},
|
||||
"AWS/ML": {"PredictCount", "PredictFailureCount"},
|
||||
"AWS/OpsWorks": {"cpu_idle", "cpu_nice", "cpu_system", "cpu_user", "cpu_waitio", "load_1", "load_5", "load_15", "memory_buffers", "memory_cached", "memory_free", "memory_swap", "memory_total", "memory_used", "procs"},
|
||||
"AWS/Redshift": {"CPUUtilization", "DatabaseConnections", "HealthStatus", "MaintenanceMode", "NetworkReceiveThroughput", "NetworkTransmitThroughput", "PercentageDiskSpaceUsed", "ReadIOPS", "ReadLatency", "ReadThroughput", "WriteIOPS", "WriteLatency", "WriteThroughput"},
|
||||
"AWS/RDS": {"ActiveTransactions", "AuroraBinlogReplicaLag", "AuroraReplicaLag", "AuroraReplicaLagMaximum", "AuroraReplicaLagMinimum", "BinLogDiskUsage", "BlockedTransactions", "BufferCacheHitRatio", "CommitLatency", "CommitThroughput", "CPUCreditBalance", "CPUCreditUsage", "CPUUtilization", "DatabaseConnections", "DDLLatency", "DDLThroughput", "Deadlocks", "DiskQueueDepth", "DMLLatency", "DMLThroughput", "FailedSqlStatements", "FreeableMemory", "FreeStorageSpace", "LoginFailures", "NetworkReceiveThroughput", "NetworkTransmitThroughput", "ReadIOPS", "ReadLatency", "ReadThroughput", "ReplicaLag", "ResultSetCacheHitRatio", "SelectLatency", "SelectThroughput", "SwapUsage", "TotalConnections", "VolumeReadIOPS", "VolumeWriteIOPS", "WriteIOPS", "WriteLatency", "WriteThroughput"},
|
||||
"AWS/Route53": {"HealthCheckStatus", "HealthCheckPercentageHealthy", "ConnectionTime", "SSLHandshakeTime", "TimeToFirstByte"},
|
||||
"AWS/RDS": {"ActiveTransactions", "AuroraBinlogReplicaLag", "AuroraReplicaLag", "AuroraReplicaLagMaximum", "AuroraReplicaLagMinimum", "BinLogDiskUsage", "BlockedTransactions", "BufferCacheHitRatio", "CommitLatency", "CommitThroughput", "BinLogDiskUsage", "CPUCreditBalance", "CPUCreditUsage", "CPUUtilization", "DatabaseConnections", "DDLLatency", "DDLThroughput", "Deadlocks", "DeleteLatency", "DeleteThroughput", "DiskQueueDepth", "DMLLatency", "DMLThroughput", "EngineUptime", "FailedSqlStatements", "FreeableMemory", "FreeLocalStorage", "FreeStorageSpace", "InsertLatency", "InsertThroughput", "LoginFailures", "NetworkReceiveThroughput", "NetworkTransmitThroughput", "NetworkThroughput", "Queries", "ReadIOPS", "ReadLatency", "ReadThroughput", "ReplicaLag", "ResultSetCacheHitRatio", "SelectLatency", "SelectThroughput", "SwapUsage", "TotalConnections", "UpdateLatency", "UpdateThroughput", "VolumeBytesUsed", "VolumeReadIOPS", "VolumeWriteIOPS", "WriteIOPS", "WriteLatency", "WriteThroughput"},
|
||||
"AWS/Route53": {"ChildHealthCheckHealthyCount", "HealthCheckStatus", "HealthCheckPercentageHealthy", "ConnectionTime", "SSLHandshakeTime", "TimeToFirstByte"},
|
||||
"AWS/S3": {"BucketSizeBytes", "NumberOfObjects", "AllRequests", "GetRequests", "PutRequests", "DeleteRequests", "HeadRequests", "PostRequests", "ListRequests", "BytesDownloaded", "BytesUploaded", "4xxErrors", "5xxErrors", "FirstByteLatency", "TotalRequestLatency"},
|
||||
"AWS/SES": {"Bounce", "Complaint", "Delivery", "Reject", "Send"},
|
||||
"AWS/SNS": {"NumberOfMessagesPublished", "PublishSize", "NumberOfNotificationsDelivered", "NumberOfNotificationsFailed"},
|
||||
"AWS/SQS": {"NumberOfMessagesSent", "SentMessageSize", "NumberOfMessagesReceived", "NumberOfEmptyReceives", "NumberOfMessagesDeleted", "ApproximateNumberOfMessagesDelayed", "ApproximateNumberOfMessagesVisible", "ApproximateNumberOfMessagesNotVisible"},
|
||||
"AWS/SQS": {"NumberOfMessagesSent", "SentMessageSize", "NumberOfMessagesReceived", "NumberOfEmptyReceives", "NumberOfMessagesDeleted", "ApproximateAgeOfOldestMessage", "ApproximateNumberOfMessagesDelayed", "ApproximateNumberOfMessagesVisible", "ApproximateNumberOfMessagesNotVisible"},
|
||||
"AWS/StorageGateway": {"CacheHitPercent", "CachePercentUsed", "CachePercentDirty", "CloudBytesDownloaded", "CloudDownloadLatency", "CloudBytesUploaded", "UploadBufferFree", "UploadBufferPercentUsed", "UploadBufferUsed", "QueuedWrites", "ReadBytes", "ReadTime", "TotalCacheSize", "WriteBytes", "WriteTime", "TimeSinceLastRecoveryPoint", "WorkingStorageFree", "WorkingStoragePercentUsed", "WorkingStorageUsed",
|
||||
"CacheHitPercent", "CachePercentUsed", "CachePercentDirty", "ReadBytes", "ReadTime", "WriteBytes", "WriteTime", "QueuedWrites"},
|
||||
"AWS/SWF": {"DecisionTaskScheduleToStartTime", "DecisionTaskStartToCloseTime", "DecisionTasksCompleted", "StartedDecisionTasksTimedOutOnClose", "WorkflowStartToCloseTime", "WorkflowsCanceled", "WorkflowsCompleted", "WorkflowsContinuedAsNew", "WorkflowsFailed", "WorkflowsTerminated", "WorkflowsTimedOut",
|
||||
"ActivityTaskScheduleToCloseTime", "ActivityTaskScheduleToStartTime", "ActivityTaskStartToCloseTime", "ActivityTasksCanceled", "ActivityTasksCompleted", "ActivityTasksFailed", "ScheduledActivityTasksTimedOutOnClose", "ScheduledActivityTasksTimedOutOnStart", "StartedActivityTasksTimedOutOnClose", "StartedActivityTasksTimedOutOnHeartbeat"},
|
||||
"AWS/VPN": {"TunnelState", "TunnelDataIn", "TunnelDataOut"},
|
||||
"AWS/WAF": {"AllowedRequests", "BlockedRequests", "CountedRequests"},
|
||||
"AWS/WorkSpaces": {"Available", "Unhealthy", "ConnectionAttempt", "ConnectionSuccess", "ConnectionFailure", "SessionLaunchTime", "InSessionLatency", "SessionDisconnect"},
|
||||
"KMS": {"SecondsUntilKeyMaterialExpiration"},
|
||||
@ -100,6 +102,7 @@ func init() {
|
||||
"AWS/Billing": {"ServiceName", "LinkedAccount", "Currency"},
|
||||
"AWS/CloudFront": {"DistributionId", "Region"},
|
||||
"AWS/CloudSearch": {},
|
||||
"AWS/DMS": {"ReplicationInstanceIdentifier", "ReplicationTaskIdentifier"},
|
||||
"AWS/DynamoDB": {"TableName", "GlobalSecondaryIndexName", "Operation", "StreamLabel"},
|
||||
"AWS/EBS": {"VolumeId"},
|
||||
"AWS/EC2": {"AutoScalingGroupName", "ImageId", "InstanceId", "InstanceType"},
|
||||
@ -121,14 +124,15 @@ func init() {
|
||||
"AWS/ML": {"MLModelId", "RequestMode"},
|
||||
"AWS/OpsWorks": {"StackId", "LayerId", "InstanceId"},
|
||||
"AWS/Redshift": {"NodeID", "ClusterIdentifier"},
|
||||
"AWS/RDS": {"DBInstanceIdentifier", "DBClusterIdentifier", "DatabaseClass", "EngineName"},
|
||||
"AWS/Route53": {"HealthCheckId"},
|
||||
"AWS/RDS": {"DBInstanceIdentifier", "DBClusterIdentifier", "DatabaseClass", "EngineName", "Role"},
|
||||
"AWS/Route53": {"HealthCheckId", "Region"},
|
||||
"AWS/S3": {"BucketName", "StorageType", "FilterId"},
|
||||
"AWS/SES": {},
|
||||
"AWS/SNS": {"Application", "Platform", "TopicName"},
|
||||
"AWS/SQS": {"QueueName"},
|
||||
"AWS/StorageGateway": {"GatewayId", "GatewayName", "VolumeId"},
|
||||
"AWS/SWF": {"Domain", "WorkflowTypeName", "WorkflowTypeVersion", "ActivityTypeName", "ActivityTypeVersion"},
|
||||
"AWS/VPN": {"VpnId", "TunnelIpAddress"},
|
||||
"AWS/WAF": {"Rule", "WebACL"},
|
||||
"AWS/WorkSpaces": {"DirectoryId", "WorkspaceId"},
|
||||
"KMS": {"KeyId"},
|
||||
|
@ -2,12 +2,14 @@ package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"os"
|
||||
"path"
|
||||
"strings"
|
||||
|
||||
"github.com/grafana/grafana/pkg/api/dtos"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/dashdiffs"
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
"github.com/grafana/grafana/pkg/log"
|
||||
"github.com/grafana/grafana/pkg/metrics"
|
||||
@ -60,6 +62,9 @@ func GetDashboard(c *middleware.Context) {
|
||||
creator = getUserLogin(dash.CreatedBy)
|
||||
}
|
||||
|
||||
// make sure db version is in sync with json model version
|
||||
dash.Data.Set("version", dash.Version)
|
||||
|
||||
dto := dtos.DashboardFullWithMeta{
|
||||
Dashboard: dash.Data,
|
||||
Meta: dtos.DashboardMeta{
|
||||
@ -77,6 +82,7 @@ func GetDashboard(c *middleware.Context) {
|
||||
},
|
||||
}
|
||||
|
||||
// TODO(ben): copy this performance metrics logic for the new API endpoints added
|
||||
c.TimeRequest(metrics.M_Api_Dashboard_Get)
|
||||
c.JSON(200, dto)
|
||||
}
|
||||
@ -114,18 +120,15 @@ func DeleteDashboard(c *middleware.Context) {
|
||||
|
||||
func PostDashboard(c *middleware.Context, cmd m.SaveDashboardCommand) Response {
|
||||
cmd.OrgId = c.OrgId
|
||||
|
||||
if !c.IsSignedIn {
|
||||
cmd.UserId = -1
|
||||
} else {
|
||||
cmd.UserId = c.UserId
|
||||
}
|
||||
cmd.UserId = c.UserId
|
||||
|
||||
dash := cmd.GetDashboardModel()
|
||||
|
||||
// Check if Title is empty
|
||||
if dash.Title == "" {
|
||||
return ApiError(400, m.ErrDashboardTitleEmpty.Error(), nil)
|
||||
}
|
||||
|
||||
if dash.Id == 0 {
|
||||
limitReached, err := middleware.QuotaReached(c, "dashboard")
|
||||
if err != nil {
|
||||
@ -255,6 +258,135 @@ func GetDashboardFromJsonFile(c *middleware.Context) {
|
||||
c.JSON(200, &dash)
|
||||
}
|
||||
|
||||
// GetDashboardVersions returns all dashboard versions as JSON
|
||||
func GetDashboardVersions(c *middleware.Context) Response {
|
||||
dashboardId := c.ParamsInt64(":dashboardId")
|
||||
limit := c.QueryInt("limit")
|
||||
start := c.QueryInt("start")
|
||||
|
||||
if limit == 0 {
|
||||
limit = 1000
|
||||
}
|
||||
|
||||
query := m.GetDashboardVersionsQuery{
|
||||
OrgId: c.OrgId,
|
||||
DashboardId: dashboardId,
|
||||
Limit: limit,
|
||||
Start: start,
|
||||
}
|
||||
|
||||
if err := bus.Dispatch(&query); err != nil {
|
||||
return ApiError(404, fmt.Sprintf("No versions found for dashboardId %d", dashboardId), err)
|
||||
}
|
||||
|
||||
for _, version := range query.Result {
|
||||
if version.RestoredFrom == version.Version {
|
||||
version.Message = "Initial save (created by migration)"
|
||||
continue
|
||||
}
|
||||
|
||||
if version.RestoredFrom > 0 {
|
||||
version.Message = fmt.Sprintf("Restored from version %d", version.RestoredFrom)
|
||||
continue
|
||||
}
|
||||
|
||||
if version.ParentVersion == 0 {
|
||||
version.Message = "Initial save"
|
||||
}
|
||||
}
|
||||
|
||||
return Json(200, query.Result)
|
||||
}
|
||||
|
||||
// GetDashboardVersion returns the dashboard version with the given ID.
|
||||
func GetDashboardVersion(c *middleware.Context) Response {
|
||||
dashboardId := c.ParamsInt64(":dashboardId")
|
||||
version := c.ParamsInt(":id")
|
||||
|
||||
query := m.GetDashboardVersionQuery{
|
||||
OrgId: c.OrgId,
|
||||
DashboardId: dashboardId,
|
||||
Version: version,
|
||||
}
|
||||
|
||||
if err := bus.Dispatch(&query); err != nil {
|
||||
return ApiError(500, fmt.Sprintf("Dashboard version %d not found for dashboardId %d", version, dashboardId), err)
|
||||
}
|
||||
|
||||
creator := "Anonymous"
|
||||
if query.Result.CreatedBy > 0 {
|
||||
creator = getUserLogin(query.Result.CreatedBy)
|
||||
}
|
||||
|
||||
dashVersionMeta := &m.DashboardVersionMeta{
|
||||
DashboardVersion: *query.Result,
|
||||
CreatedBy: creator,
|
||||
}
|
||||
|
||||
return Json(200, dashVersionMeta)
|
||||
}
|
||||
|
||||
// POST /api/dashboards/calculate-diff performs diffs on two dashboards
|
||||
func CalculateDashboardDiff(c *middleware.Context, apiOptions dtos.CalculateDiffOptions) Response {
|
||||
|
||||
options := dashdiffs.Options{
|
||||
OrgId: c.OrgId,
|
||||
DiffType: dashdiffs.ParseDiffType(apiOptions.DiffType),
|
||||
Base: dashdiffs.DiffTarget{
|
||||
DashboardId: apiOptions.Base.DashboardId,
|
||||
Version: apiOptions.Base.Version,
|
||||
UnsavedDashboard: apiOptions.Base.UnsavedDashboard,
|
||||
},
|
||||
New: dashdiffs.DiffTarget{
|
||||
DashboardId: apiOptions.New.DashboardId,
|
||||
Version: apiOptions.New.Version,
|
||||
UnsavedDashboard: apiOptions.New.UnsavedDashboard,
|
||||
},
|
||||
}
|
||||
|
||||
result, err := dashdiffs.CalculateDiff(&options)
|
||||
if err != nil {
|
||||
if err == m.ErrDashboardVersionNotFound {
|
||||
return ApiError(404, "Dashboard version not found", err)
|
||||
}
|
||||
return ApiError(500, "Unable to compute diff", err)
|
||||
}
|
||||
|
||||
if options.DiffType == dashdiffs.DiffDelta {
|
||||
return Respond(200, result.Delta).Header("Content-Type", "application/json")
|
||||
} else {
|
||||
return Respond(200, result.Delta).Header("Content-Type", "text/html")
|
||||
}
|
||||
}
|
||||
|
||||
// RestoreDashboardVersion restores a dashboard to the given version.
|
||||
func RestoreDashboardVersion(c *middleware.Context, apiCmd dtos.RestoreDashboardVersionCommand) Response {
|
||||
dashboardId := c.ParamsInt64(":dashboardId")
|
||||
|
||||
dashQuery := m.GetDashboardQuery{Id: dashboardId, OrgId: c.OrgId}
|
||||
if err := bus.Dispatch(&dashQuery); err != nil {
|
||||
return ApiError(404, "Dashboard not found", nil)
|
||||
}
|
||||
|
||||
versionQuery := m.GetDashboardVersionQuery{DashboardId: dashboardId, Version: apiCmd.Version, OrgId: c.OrgId}
|
||||
if err := bus.Dispatch(&versionQuery); err != nil {
|
||||
return ApiError(404, "Dashboard version not found", nil)
|
||||
}
|
||||
|
||||
dashboard := dashQuery.Result
|
||||
version := versionQuery.Result
|
||||
|
||||
saveCmd := m.SaveDashboardCommand{}
|
||||
saveCmd.RestoredFrom = version.Version
|
||||
saveCmd.OrgId = c.OrgId
|
||||
saveCmd.UserId = c.UserId
|
||||
saveCmd.Dashboard = version.Data
|
||||
saveCmd.Dashboard.Set("version", dashboard.Version)
|
||||
saveCmd.Message = fmt.Sprintf("Restored from version %d", version.Version)
|
||||
|
||||
return PostDashboard(c, saveCmd)
|
||||
}
|
||||
|
||||
func GetDashboardTags(c *middleware.Context) {
|
||||
query := m.GetDashboardTagsQuery{OrgId: c.OrgId}
|
||||
err := bus.Dispatch(&query)
|
||||
|
@ -3,6 +3,7 @@ package api
|
||||
import (
|
||||
"bytes"
|
||||
"io/ioutil"
|
||||
"net"
|
||||
"net/http"
|
||||
"net/http/httputil"
|
||||
"net/url"
|
||||
@ -62,6 +63,27 @@ func NewReverseProxy(ds *m.DataSource, proxyPath string, targetUrl *url.URL) *ht
|
||||
// clear cookie headers
|
||||
req.Header.Del("Cookie")
|
||||
req.Header.Del("Set-Cookie")
|
||||
|
||||
// clear X-Forwarded Host/Port/Proto headers
|
||||
req.Header.Del("X-Forwarded-Host")
|
||||
req.Header.Del("X-Forwarded-Port")
|
||||
req.Header.Del("X-Forwarded-Proto")
|
||||
|
||||
// set X-Forwarded-For header
|
||||
if req.RemoteAddr != "" {
|
||||
remoteAddr, _, err := net.SplitHostPort(req.RemoteAddr)
|
||||
if err != nil {
|
||||
remoteAddr = req.RemoteAddr
|
||||
}
|
||||
if req.Header.Get("X-Forwarded-For") != "" {
|
||||
req.Header.Set("X-Forwarded-For", req.Header.Get("X-Forwarded-For")+", "+remoteAddr)
|
||||
} else {
|
||||
req.Header.Set("X-Forwarded-For", remoteAddr)
|
||||
}
|
||||
}
|
||||
|
||||
// reqBytes, _ := httputil.DumpRequestOut(req, true);
|
||||
// log.Trace("Proxying datasource request: %s", string(reqBytes))
|
||||
}
|
||||
|
||||
return &httputil.ReverseProxy{Director: director, FlushInterval: time.Millisecond * 200}
|
||||
|
@ -13,6 +13,7 @@ type Annotation struct {
|
||||
Text string `json:"text"`
|
||||
Metric string `json:"metric"`
|
||||
RegionId int64 `json:"regionId"`
|
||||
Type string `json:"type"`
|
||||
|
||||
Data *simplejson.Json `json:"data"`
|
||||
}
|
||||
|
pkg/api/dtos/dashboard.go (new file, 49 lines)

@@ -0,0 +1,49 @@
|
||||
package dtos
|
||||
|
||||
import (
|
||||
"time"
|
||||
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
)
|
||||
|
||||
type DashboardMeta struct {
|
||||
IsStarred bool `json:"isStarred,omitempty"`
|
||||
IsHome bool `json:"isHome,omitempty"`
|
||||
IsSnapshot bool `json:"isSnapshot,omitempty"`
|
||||
Type string `json:"type,omitempty"`
|
||||
CanSave bool `json:"canSave"`
|
||||
CanEdit bool `json:"canEdit"`
|
||||
CanStar bool `json:"canStar"`
|
||||
Slug string `json:"slug"`
|
||||
Expires time.Time `json:"expires"`
|
||||
Created time.Time `json:"created"`
|
||||
Updated time.Time `json:"updated"`
|
||||
UpdatedBy string `json:"updatedBy"`
|
||||
CreatedBy string `json:"createdBy"`
|
||||
Version int `json:"version"`
|
||||
}
|
||||
|
||||
type DashboardFullWithMeta struct {
|
||||
Meta DashboardMeta `json:"meta"`
|
||||
Dashboard *simplejson.Json `json:"dashboard"`
|
||||
}
|
||||
|
||||
type DashboardRedirect struct {
|
||||
RedirectUri string `json:"redirectUri"`
|
||||
}
|
||||
|
||||
type CalculateDiffOptions struct {
|
||||
Base CalculateDiffTarget `json:"base" binding:"Required"`
|
||||
New CalculateDiffTarget `json:"new" binding:"Required"`
|
||||
DiffType string `json:"diffType" binding:"Required"`
|
||||
}
|
||||
|
||||
type CalculateDiffTarget struct {
|
||||
DashboardId int64 `json:"dashboardId"`
|
||||
Version int `json:"version"`
|
||||
UnsavedDashboard *simplejson.Json `json:"unsavedDashboard"`
|
||||
}
|
||||
|
||||
type RestoreDashboardVersionCommand struct {
|
||||
Version int `json:"version" binding:"Required"`
|
||||
}
|
@ -4,7 +4,6 @@ import (
|
||||
"crypto/md5"
|
||||
"fmt"
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
@ -38,32 +37,6 @@ type CurrentUser struct {
|
||||
HelpFlags1 m.HelpFlags1 `json:"helpFlags1"`
|
||||
}
|
||||
|
||||
type DashboardMeta struct {
|
||||
IsStarred bool `json:"isStarred,omitempty"`
|
||||
IsHome bool `json:"isHome,omitempty"`
|
||||
IsSnapshot bool `json:"isSnapshot,omitempty"`
|
||||
Type string `json:"type,omitempty"`
|
||||
CanSave bool `json:"canSave"`
|
||||
CanEdit bool `json:"canEdit"`
|
||||
CanStar bool `json:"canStar"`
|
||||
Slug string `json:"slug"`
|
||||
Expires time.Time `json:"expires"`
|
||||
Created time.Time `json:"created"`
|
||||
Updated time.Time `json:"updated"`
|
||||
UpdatedBy string `json:"updatedBy"`
|
||||
CreatedBy string `json:"createdBy"`
|
||||
Version int `json:"version"`
|
||||
}
|
||||
|
||||
type DashboardFullWithMeta struct {
|
||||
Meta DashboardMeta `json:"meta"`
|
||||
Dashboard *simplejson.Json `json:"dashboard"`
|
||||
}
|
||||
|
||||
type DashboardRedirect struct {
|
||||
RedirectUri string `json:"redirectUri"`
|
||||
}
|
||||
|
||||
type DataSource struct {
|
||||
Id int64 `json:"id"`
|
||||
OrgId int64 `json:"orgId"`
|
||||
|
@ -13,7 +13,7 @@ import (
|
||||
"github.com/grafana/grafana/pkg/util"
|
||||
)
|
||||
|
||||
var gNetProxyTransport = &http.Transport{
|
||||
var grafanaComProxyTransport = &http.Transport{
|
||||
TLSClientConfig: &tls.Config{InsecureSkipVerify: false},
|
||||
Proxy: http.ProxyFromEnvironment,
|
||||
Dial: (&net.Dialer{
|
||||
@ -24,7 +24,7 @@ var gNetProxyTransport = &http.Transport{
|
||||
}
|
||||
|
||||
func ReverseProxyGnetReq(proxyPath string) *httputil.ReverseProxy {
|
||||
url, _ := url.Parse(setting.GrafanaNetUrl)
|
||||
url, _ := url.Parse(setting.GrafanaComUrl)
|
||||
|
||||
director := func(req *http.Request) {
|
||||
req.URL.Scheme = url.Scheme
|
||||
@ -45,7 +45,7 @@ func ReverseProxyGnetReq(proxyPath string) *httputil.ReverseProxy {
|
||||
func ProxyGnetRequest(c *middleware.Context) {
|
||||
proxyPath := c.Params("*")
|
||||
proxy := ReverseProxyGnetReq(proxyPath)
|
||||
proxy.Transport = gNetProxyTransport
|
||||
proxy.Transport = grafanaComProxyTransport
|
||||
proxy.ServeHTTP(c.Resp, c.Req.Request)
|
||||
c.Resp.Header().Del("Set-Cookie")
|
||||
}
|
@ -61,7 +61,7 @@ func (hs *HttpServer) Start(ctx context.Context) error {
|
||||
return nil
|
||||
}
|
||||
case setting.HTTPS:
|
||||
err = hs.httpSrv.ListenAndServeTLS(setting.CertFile, setting.KeyFile)
|
||||
err = hs.listenAndServeTLS(setting.CertFile, setting.KeyFile)
|
||||
if err == http.ErrServerClosed {
|
||||
hs.log.Debug("server was shutdown gracefully")
|
||||
return nil
|
||||
@ -92,7 +92,7 @@ func (hs *HttpServer) Shutdown(ctx context.Context) error {
|
||||
return err
|
||||
}
|
||||
|
||||
func (hs *HttpServer) listenAndServeTLS(listenAddr, certfile, keyfile string) error {
|
||||
func (hs *HttpServer) listenAndServeTLS(certfile, keyfile string) error {
|
||||
if certfile == "" {
|
||||
return fmt.Errorf("cert_file cannot be empty when using HTTPS")
|
||||
}
|
||||
@ -127,14 +127,11 @@ func (hs *HttpServer) listenAndServeTLS(listenAddr, certfile, keyfile string) er
|
||||
tls.TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
|
||||
},
|
||||
}
|
||||
srv := &http.Server{
|
||||
Addr: listenAddr,
|
||||
Handler: hs.macaron,
|
||||
TLSConfig: tlsCfg,
|
||||
TLSNextProto: make(map[string]func(*http.Server, *tls.Conn, http.Handler), 0),
|
||||
}
|
||||
|
||||
return srv.ListenAndServeTLS(setting.CertFile, setting.KeyFile)
|
||||
hs.httpSrv.TLSConfig = tlsCfg
|
||||
hs.httpSrv.TLSNextProto = make(map[string]func(*http.Server, *tls.Conn, http.Handler), 0)
|
||||
|
||||
return hs.httpSrv.ListenAndServeTLS(setting.CertFile, setting.KeyFile)
|
||||
}
|
||||
|
||||
func (hs *HttpServer) newMacaron() *macaron.Macaron {
|
||||
|
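The HTTPS change above stops building a second `http.Server` and instead attaches the `tls.Config` and an empty `TLSNextProto` map (which disables the HTTP/2 upgrade path) to the existing `hs.httpSrv` before calling `ListenAndServeTLS`. A minimal standalone sketch of that standard-library pattern, assuming placeholder listen address and certificate paths:

```go
package main

import (
	"crypto/tls"
	"log"
	"net/http"
)

func main() {
	tlsCfg := &tls.Config{
		MinVersion: tls.VersionTLS12,
		CipherSuites: []uint16{
			tls.TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,
			tls.TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
		},
	}

	srv := &http.Server{
		Addr:      ":3443", // placeholder listen address
		Handler:   http.DefaultServeMux,
		TLSConfig: tlsCfg,
		// A non-nil, empty TLSNextProto map disables the HTTP/2 upgrade.
		TLSNextProto: make(map[string]func(*http.Server, *tls.Conn, http.Handler)),
	}

	// Cert and key paths are placeholders for this sketch.
	if err := srv.ListenAndServeTLS("server.crt", "server.key"); err != http.ErrServerClosed {
		log.Fatal(err)
	}
}
```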
@ -28,6 +28,7 @@ var (
|
||||
ErrEmailNotAllowed = errors.New("Required email domain not fulfilled")
|
||||
ErrSignUpNotAllowed = errors.New("Signup is not allowed for this adapter")
|
||||
ErrUsersQuotaReached = errors.New("Users quota reached")
|
||||
ErrNoEmail = errors.New("Login provider didn't return an email address")
|
||||
)
|
||||
|
||||
func GenStateString() string {
|
||||
@ -63,7 +64,7 @@ func OAuthLogin(ctx *middleware.Context) {
|
||||
if setting.OAuthService.OAuthInfos[name].HostedDomain == "" {
|
||||
ctx.Redirect(connect.AuthCodeURL(state, oauth2.AccessTypeOnline))
|
||||
} else {
|
||||
ctx.Redirect(connect.AuthCodeURL(state, oauth2.SetParam("hd", setting.OAuthService.OAuthInfos[name].HostedDomain), oauth2.AccessTypeOnline))
|
||||
ctx.Redirect(connect.AuthCodeURL(state, oauth2.SetAuthURLParam("hd", setting.OAuthService.OAuthInfos[name].HostedDomain), oauth2.AccessTypeOnline))
|
||||
}
|
||||
return
|
||||
}
|
||||
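The hunk above swaps `oauth2.SetParam` for `oauth2.SetAuthURLParam`, the `AuthCodeOption` constructor in golang.org/x/oauth2, so the Google `hd` (hosted domain) restriction ends up in the authorization URL. A small sketch of how that option composes with `AuthCodeURL`; the client credentials, redirect URL, and domain below are placeholders:

```go
package main

import (
	"fmt"

	"golang.org/x/oauth2"
	"golang.org/x/oauth2/google"
)

func main() {
	conf := &oauth2.Config{
		ClientID:     "client-id",     // placeholder
		ClientSecret: "client-secret", // placeholder
		RedirectURL:  "http://localhost:3000/login/google",
		Scopes:       []string{"openid", "email"},
		Endpoint:     google.Endpoint,
	}

	// SetAuthURLParam appends an arbitrary query parameter to the auth code
	// URL; "hd" restricts the Google account picker to one hosted domain.
	url := conf.AuthCodeURL("random-state",
		oauth2.SetAuthURLParam("hd", "example.com"),
		oauth2.AccessTypeOnline)

	fmt.Println(url)
}
```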
@ -134,6 +135,12 @@ func OAuthLogin(ctx *middleware.Context) {
|
||||
|
||||
ctx.Logger.Debug("OAuthLogin got user info", "userInfo", userInfo)
|
||||
|
||||
// validate that we got at least an email address
|
||||
if userInfo.Email == "" {
|
||||
redirectWithError(ctx, ErrNoEmail)
|
||||
return
|
||||
}
|
||||
|
||||
// validate that the email is allowed to login to grafana
|
||||
if !connect.IsEmailAllowed(userInfo.Email) {
|
||||
redirectWithError(ctx, ErrEmailNotAllowed)
|
||||
|
@ -7,6 +7,7 @@ import (
|
||||
|
||||
"github.com/grafana/grafana/pkg/api/dtos"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
"github.com/grafana/grafana/pkg/metrics"
|
||||
"github.com/grafana/grafana/pkg/middleware"
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
@ -144,3 +145,29 @@ func GenerateSqlTestData(c *middleware.Context) Response {
|
||||
|
||||
return Json(200, &util.DynMap{"message": "OK"})
|
||||
}
|
||||
|
||||
// GET /api/tsdb/testdata/random-walk
|
||||
func GetTestDataRandomWalk(c *middleware.Context) Response {
|
||||
from := c.Query("from")
|
||||
to := c.Query("to")
|
||||
intervalMs := c.QueryInt64("intervalMs")
|
||||
|
||||
timeRange := tsdb.NewTimeRange(from, to)
|
||||
request := &tsdb.Request{TimeRange: timeRange}
|
||||
|
||||
request.Queries = append(request.Queries, &tsdb.Query{
|
||||
RefId: "A",
|
||||
IntervalMs: intervalMs,
|
||||
Model: simplejson.NewFromAny(&util.DynMap{
|
||||
"scenario": "random_walk",
|
||||
}),
|
||||
DataSource: &models.DataSource{Type: "grafana-testdata-datasource"},
|
||||
})
|
||||
|
||||
resp, err := tsdb.HandleRequest(context.Background(), request)
|
||||
if err != nil {
|
||||
return ApiError(500, "Metric request error", err)
|
||||
}
|
||||
|
||||
return Json(200, &resp)
|
||||
}
|
||||
|
@ -91,6 +91,6 @@ func LoadPlaylistDashboards(orgId, userId, playlistId int64) (dtos.PlaylistDashb
|
||||
result = append(result, k...)
|
||||
result = append(result, populateDashboardsByTag(orgId, userId, dashboardByTag, dashboardTagOrder)...)
|
||||
|
||||
sort.Sort(sort.Reverse(result))
|
||||
sort.Sort(result)
|
||||
return result, nil
|
||||
}
|
||||
|
@ -5,6 +5,7 @@ import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"net"
|
||||
"net/http"
|
||||
"net/http/httputil"
|
||||
"net/url"
|
||||
@ -71,7 +72,25 @@ func NewApiPluginProxy(ctx *middleware.Context, proxyPath string, route *plugins
|
||||
req.Header.Del("Cookie")
|
||||
req.Header.Del("Set-Cookie")
|
||||
|
||||
//Create a HTTP header with the context in it.
|
||||
// clear X-Forwarded Host/Port/Proto headers
|
||||
req.Header.Del("X-Forwarded-Host")
|
||||
req.Header.Del("X-Forwarded-Port")
|
||||
req.Header.Del("X-Forwarded-Proto")
|
||||
|
||||
// set X-Forwarded-For header
|
||||
if req.RemoteAddr != "" {
|
||||
remoteAddr, _, err := net.SplitHostPort(req.RemoteAddr)
|
||||
if err != nil {
|
||||
remoteAddr = req.RemoteAddr
|
||||
}
|
||||
if req.Header.Get("X-Forwarded-For") != "" {
|
||||
req.Header.Set("X-Forwarded-For", req.Header.Get("X-Forwarded-For")+", "+remoteAddr)
|
||||
} else {
|
||||
req.Header.Set("X-Forwarded-For", remoteAddr)
|
||||
}
|
||||
}
|
||||
|
||||
// Create a HTTP header with the context in it.
|
||||
ctxJson, err := json.Marshal(ctx.SignedInUser)
|
||||
if err != nil {
|
||||
ctx.JsonApiErr(500, "failed to marshal context to json.", err)
|
||||
@ -93,6 +112,8 @@ func NewApiPluginProxy(ctx *middleware.Context, proxyPath string, route *plugins
|
||||
}
|
||||
}
|
||||
|
||||
// reqBytes, _ := httputil.DumpRequestOut(req, true);
|
||||
// log.Trace("Proxying plugin request: %s", string(reqBytes))
|
||||
}
|
||||
|
||||
return &httputil.ReverseProxy{Director: director}
|
||||
|
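The plugin proxy director above clears the inbound `X-Forwarded-Host/Port/Proto` headers and then appends the caller's address to `X-Forwarded-For`. A self-contained sketch of that append logic, using a hypothetical helper name:

```go
package main

import (
	"fmt"
	"net"
	"net/http"
)

// appendXForwardedFor mirrors the director logic above: strip the port from
// RemoteAddr and either extend or create the X-Forwarded-For header.
func appendXForwardedFor(req *http.Request) {
	if req.RemoteAddr == "" {
		return
	}
	remoteAddr, _, err := net.SplitHostPort(req.RemoteAddr)
	if err != nil {
		remoteAddr = req.RemoteAddr
	}
	if prior := req.Header.Get("X-Forwarded-For"); prior != "" {
		req.Header.Set("X-Forwarded-For", prior+", "+remoteAddr)
	} else {
		req.Header.Set("X-Forwarded-For", remoteAddr)
	}
}

func main() {
	req, _ := http.NewRequest("GET", "http://example.com/", nil)
	req.RemoteAddr = "10.0.0.5:54321"
	req.Header.Set("X-Forwarded-For", "192.0.2.1")

	appendXForwardedFor(req)
	fmt.Println(req.Header.Get("X-Forwarded-For")) // 192.0.2.1, 10.0.0.5
}
```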
149 pkg/components/dashdiffs/compare.go (new file)
@ -0,0 +1,149 @@
|
||||
package dashdiffs
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
"github.com/grafana/grafana/pkg/log"
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
diff "github.com/yudai/gojsondiff"
|
||||
deltaFormatter "github.com/yudai/gojsondiff/formatter"
|
||||
)
|
||||
|
||||
var (
|
||||
// ErrUnsupportedDiffType occurs when an invalid diff type is used.
|
||||
ErrUnsupportedDiffType = errors.New("dashdiff: unsupported diff type")
|
||||
|
||||
// ErrNilDiff occurs when two compared interfaces are identical.
|
||||
ErrNilDiff = errors.New("dashdiff: diff is nil")
|
||||
|
||||
diffLogger = log.New("dashdiffs")
|
||||
)
|
||||
|
||||
type DiffType int
|
||||
|
||||
const (
|
||||
DiffJSON DiffType = iota
|
||||
DiffBasic
|
||||
DiffDelta
|
||||
)
|
||||
|
||||
type Options struct {
|
||||
OrgId int64
|
||||
Base DiffTarget
|
||||
New DiffTarget
|
||||
DiffType DiffType
|
||||
}
|
||||
|
||||
type DiffTarget struct {
|
||||
DashboardId int64
|
||||
Version int
|
||||
UnsavedDashboard *simplejson.Json
|
||||
}
|
||||
|
||||
type Result struct {
|
||||
Delta []byte `json:"delta"`
|
||||
}
|
||||
|
||||
func ParseDiffType(diff string) DiffType {
|
||||
switch diff {
|
||||
case "json":
|
||||
return DiffJSON
|
||||
case "basic":
|
||||
return DiffBasic
|
||||
case "delta":
|
||||
return DiffDelta
|
||||
}
|
||||
return DiffBasic
|
||||
}
|
||||
|
||||
// CalculateDiff computes the JSON diff of two dashboard versions,
|
||||
// assigning the delta of the diff to the `Delta` field.
|
||||
func CalculateDiff(options *Options) (*Result, error) {
|
||||
baseVersionQuery := models.GetDashboardVersionQuery{
|
||||
DashboardId: options.Base.DashboardId,
|
||||
Version: options.Base.Version,
|
||||
OrgId: options.OrgId,
|
||||
}
|
||||
|
||||
if err := bus.Dispatch(&baseVersionQuery); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
newVersionQuery := models.GetDashboardVersionQuery{
|
||||
DashboardId: options.New.DashboardId,
|
||||
Version: options.New.Version,
|
||||
OrgId: options.OrgId,
|
||||
}
|
||||
|
||||
if err := bus.Dispatch(&newVersionQuery); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
baseData := baseVersionQuery.Result.Data
|
||||
newData := newVersionQuery.Result.Data
|
||||
|
||||
left, jsonDiff, err := getDiff(baseData, newData)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
result := &Result{}
|
||||
|
||||
switch options.DiffType {
|
||||
case DiffDelta:
|
||||
|
||||
deltaOutput, err := deltaFormatter.NewDeltaFormatter().Format(jsonDiff)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
result.Delta = []byte(deltaOutput)
|
||||
|
||||
case DiffJSON:
|
||||
jsonOutput, err := NewJSONFormatter(left).Format(jsonDiff)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
result.Delta = []byte(jsonOutput)
|
||||
|
||||
case DiffBasic:
|
||||
basicOutput, err := NewBasicFormatter(left).Format(jsonDiff)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
result.Delta = basicOutput
|
||||
|
||||
default:
|
||||
return nil, ErrUnsupportedDiffType
|
||||
}
|
||||
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// getDiff computes the diff of two dashboard versions.
|
||||
func getDiff(baseData, newData *simplejson.Json) (interface{}, diff.Diff, error) {
|
||||
leftBytes, err := baseData.Encode()
|
||||
if err != nil {
|
||||
return nil, nil, err
|
||||
}
|
||||
|
||||
rightBytes, err := newData.Encode()
|
||||
if err != nil {
|
||||
return nil, nil, err
|
||||
}
|
||||
|
||||
jsonDiff, err := diff.New().Compare(leftBytes, rightBytes)
|
||||
if err != nil {
|
||||
return nil, nil, err
|
||||
}
|
||||
|
||||
if !jsonDiff.Modified() {
|
||||
return nil, nil, ErrNilDiff
|
||||
}
|
||||
|
||||
left := make(map[string]interface{})
|
||||
err = json.Unmarshal(leftBytes, &left)
|
||||
return left, jsonDiff, nil
|
||||
}
|
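The compare.go file above delegates the actual comparison to github.com/yudai/gojsondiff: encode both dashboard versions, run `Compare`, then hand the resulting diff to a formatter. A minimal sketch of that flow, with two inline JSON documents standing in for the stored versions:

```go
package main

import (
	"fmt"
	"log"

	diff "github.com/yudai/gojsondiff"
	"github.com/yudai/gojsondiff/formatter"
)

func main() {
	// Two toy "dashboard" documents standing in for the stored versions.
	base := []byte(`{"title": "My dashboard", "version": 1}`)
	next := []byte(`{"title": "My dashboard (edited)", "version": 2}`)

	jsonDiff, err := diff.New().Compare(base, next)
	if err != nil {
		log.Fatal(err)
	}
	if !jsonDiff.Modified() {
		fmt.Println("no changes")
		return
	}

	// The delta formatter is the one used by the DiffDelta case above.
	out, err := formatter.NewDeltaFormatter().Format(jsonDiff)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```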
339 pkg/components/dashdiffs/formatter_basic.go (new file)
@ -0,0 +1,339 @@
|
||||
package dashdiffs
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"html/template"
|
||||
|
||||
diff "github.com/yudai/gojsondiff"
|
||||
)
|
||||
|
||||
// A BasicDiff holds the stateful values that are used when generating a basic
|
||||
// diff from JSON tokens.
|
||||
type BasicDiff struct {
|
||||
narrow string
|
||||
keysIdent int
|
||||
writing bool
|
||||
LastIndent int
|
||||
Block *BasicBlock
|
||||
Change *BasicChange
|
||||
Summary *BasicSummary
|
||||
}
|
||||
|
||||
// A BasicBlock represents a top-level element in a basic diff.
|
||||
type BasicBlock struct {
|
||||
Title string
|
||||
Old interface{}
|
||||
New interface{}
|
||||
Change ChangeType
|
||||
Changes []*BasicChange
|
||||
Summaries []*BasicSummary
|
||||
LineStart int
|
||||
LineEnd int
|
||||
}
|
||||
|
||||
// A BasicChange represents the change from an old to new value. There are many
|
||||
// BasicChanges in a BasicBlock.
|
||||
type BasicChange struct {
|
||||
Key string
|
||||
Old interface{}
|
||||
New interface{}
|
||||
Change ChangeType
|
||||
LineStart int
|
||||
LineEnd int
|
||||
}
|
||||
|
||||
// A BasicSummary represents the changes within a basic block that are too deep
|
||||
// or verbose to be represented in the top-level BasicBlock element, or in the
|
||||
// BasicChange. Instead of showing the values in this case, we simply print
|
||||
// the key and count how many times the given change was applied to that
|
||||
// element.
|
||||
type BasicSummary struct {
|
||||
Key string
|
||||
Change ChangeType
|
||||
Count int
|
||||
LineStart int
|
||||
LineEnd int
|
||||
}
|
||||
|
||||
type BasicFormatter struct {
|
||||
jsonDiff *JSONFormatter
|
||||
tpl *template.Template
|
||||
}
|
||||
|
||||
func NewBasicFormatter(left interface{}) *BasicFormatter {
|
||||
tpl := template.Must(template.New("block").Funcs(tplFuncMap).Parse(tplBlock))
|
||||
tpl = template.Must(tpl.New("change").Funcs(tplFuncMap).Parse(tplChange))
|
||||
tpl = template.Must(tpl.New("summary").Funcs(tplFuncMap).Parse(tplSummary))
|
||||
|
||||
return &BasicFormatter{
|
||||
jsonDiff: NewJSONFormatter(left),
|
||||
tpl: tpl,
|
||||
}
|
||||
}
|
||||
|
||||
func (b *BasicFormatter) Format(d diff.Diff) ([]byte, error) {
|
||||
// calling jsonDiff.Format(d) populates the JSON diff's "Lines" value,
|
||||
// which we use to compute the basic diff
|
||||
_, err := b.jsonDiff.Format(d)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
bd := &BasicDiff{}
|
||||
blocks := bd.Basic(b.jsonDiff.Lines)
|
||||
buf := &bytes.Buffer{}
|
||||
|
||||
err = b.tpl.ExecuteTemplate(buf, "block", blocks)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return buf.Bytes(), nil
|
||||
}
|
||||
|
||||
// Basic is V2 of the basic diff
|
||||
func (b *BasicDiff) Basic(lines []*JSONLine) []*BasicBlock {
|
||||
// init an array you can append to for the basic "blocks"
|
||||
blocks := make([]*BasicBlock, 0)
|
||||
|
||||
// iterate through each line
|
||||
for _, line := range lines {
|
||||
// TODO: this condition needs an explanation. What does it mean?
|
||||
if b.LastIndent == 2 && line.Indent == 1 && line.Change == ChangeNil {
|
||||
if b.Block != nil {
|
||||
blocks = append(blocks, b.Block)
|
||||
}
|
||||
}
|
||||
|
||||
b.LastIndent = line.Indent
|
||||
|
||||
// TODO: why special handling for indent 2?
|
||||
if line.Indent == 1 {
|
||||
switch line.Change {
|
||||
case ChangeNil:
|
||||
if line.Change == ChangeNil {
|
||||
if line.Key != "" {
|
||||
b.Block = &BasicBlock{
|
||||
Title: line.Key,
|
||||
Change: line.Change,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
case ChangeAdded, ChangeDeleted:
|
||||
blocks = append(blocks, &BasicBlock{
|
||||
Title: line.Key,
|
||||
Change: line.Change,
|
||||
New: line.Val,
|
||||
LineStart: line.LineNum,
|
||||
})
|
||||
|
||||
case ChangeOld:
|
||||
b.Block = &BasicBlock{
|
||||
Title: line.Key,
|
||||
Old: line.Val,
|
||||
Change: line.Change,
|
||||
LineStart: line.LineNum,
|
||||
}
|
||||
|
||||
case ChangeNew:
|
||||
b.Block.New = line.Val
|
||||
b.Block.LineEnd = line.LineNum
|
||||
|
||||
// then write out the change
|
||||
blocks = append(blocks, b.Block)
|
||||
default:
|
||||
// ok
|
||||
}
|
||||
}
|
||||
|
||||
// TODO: why special handling for indent > 2 ?
|
||||
// Other Lines
|
||||
if line.Indent > 1 {
|
||||
// Ensure single line change
|
||||
if line.Key != "" && line.Val != nil && !b.writing {
|
||||
switch line.Change {
|
||||
case ChangeAdded, ChangeDeleted:
|
||||
|
||||
b.Block.Changes = append(b.Block.Changes, &BasicChange{
|
||||
Key: line.Key,
|
||||
Change: line.Change,
|
||||
New: line.Val,
|
||||
LineStart: line.LineNum,
|
||||
})
|
||||
|
||||
case ChangeOld:
|
||||
b.Change = &BasicChange{
|
||||
Key: line.Key,
|
||||
Change: line.Change,
|
||||
Old: line.Val,
|
||||
LineStart: line.LineNum,
|
||||
}
|
||||
|
||||
case ChangeNew:
|
||||
b.Change.New = line.Val
|
||||
b.Change.LineEnd = line.LineNum
|
||||
b.Block.Changes = append(b.Block.Changes, b.Change)
|
||||
|
||||
default:
|
||||
//ok
|
||||
}
|
||||
|
||||
} else {
|
||||
if line.Change != ChangeUnchanged {
|
||||
if line.Key != "" {
|
||||
b.narrow = line.Key
|
||||
b.keysIdent = line.Indent
|
||||
}
|
||||
|
||||
if line.Change != ChangeNil {
|
||||
if !b.writing {
|
||||
b.writing = true
|
||||
key := b.Block.Title
|
||||
|
||||
if b.narrow != "" {
|
||||
key = b.narrow
|
||||
if b.keysIdent > line.Indent {
|
||||
key = b.Block.Title
|
||||
}
|
||||
}
|
||||
|
||||
b.Summary = &BasicSummary{
|
||||
Key: key,
|
||||
Change: line.Change,
|
||||
LineStart: line.LineNum,
|
||||
}
|
||||
}
|
||||
}
|
||||
} else {
|
||||
if b.writing {
|
||||
b.writing = false
|
||||
b.Summary.LineEnd = line.LineNum
|
||||
b.Block.Summaries = append(b.Block.Summaries, b.Summary)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return blocks
|
||||
}
|
||||
|
||||
// encStateMap is used in the template helper
|
||||
var (
|
||||
encStateMap = map[ChangeType]string{
|
||||
ChangeAdded: "added",
|
||||
ChangeDeleted: "deleted",
|
||||
ChangeOld: "changed",
|
||||
ChangeNew: "changed",
|
||||
}
|
||||
|
||||
// tplFuncMap is the function map for each template
|
||||
tplFuncMap = template.FuncMap{
|
||||
"getChange": func(c ChangeType) string {
|
||||
state, ok := encStateMap[c]
|
||||
if !ok {
|
||||
return "changed"
|
||||
}
|
||||
return state
|
||||
},
|
||||
}
|
||||
)
|
||||
|
||||
var (
|
||||
// tplBlock is the whole thing
|
||||
tplBlock = `{{ define "block" -}}
|
||||
{{ range . }}
|
||||
<div class="diff-group">
|
||||
<div class="diff-block">
|
||||
<h2 class="diff-block-title">
|
||||
<i class="diff-circle diff-circle-{{ getChange .Change }} fa fa-circle"></i>
|
||||
<strong class="diff-title">{{ .Title }}</strong> {{ getChange .Change }}
|
||||
</h2>
|
||||
|
||||
|
||||
<!-- Overview -->
|
||||
{{ if .Old }}
|
||||
<div class="diff-label">{{ .Old }}</div>
|
||||
<i class="diff-arrow fa fa-long-arrow-right"></i>
|
||||
{{ end }}
|
||||
{{ if .New }}
|
||||
<div class="diff-label">{{ .New }}</div>
|
||||
{{ end }}
|
||||
|
||||
{{ if .LineStart }}
|
||||
<diff-link-json
|
||||
line-link="{{ .LineStart }}"
|
||||
line-display="{{ .LineStart }}{{ if .LineEnd }} - {{ .LineEnd }}{{ end }}"
|
||||
switch-view="ctrl.getDiff('html')"
|
||||
/>
|
||||
{{ end }}
|
||||
|
||||
</div>
|
||||
|
||||
<!-- Basic Changes -->
|
||||
{{ range .Changes }}
|
||||
<ul class="diff-change-container">
|
||||
{{ template "change" . }}
|
||||
</ul>
|
||||
{{ end }}
|
||||
|
||||
<!-- Basic Summary -->
|
||||
{{ range .Summaries }}
|
||||
{{ template "summary" . }}
|
||||
{{ end }}
|
||||
|
||||
</div>
|
||||
{{ end }}
|
||||
{{ end }}`
|
||||
|
||||
// tplChange is the template for changes
|
||||
tplChange = `{{ define "change" -}}
|
||||
<li class="diff-change-group">
|
||||
<span class="bullet-position-container">
|
||||
<div class="diff-change-item diff-change-title">{{ getChange .Change }} {{ .Key }}</div>
|
||||
|
||||
<div class="diff-change-item">
|
||||
{{ if .Old }}
|
||||
<div class="diff-label">{{ .Old }}</div>
|
||||
<i class="diff-arrow fa fa-long-arrow-right"></i>
|
||||
{{ end }}
|
||||
{{ if .New }}
|
||||
<div class="diff-label">{{ .New }}</div>
|
||||
{{ end }}
|
||||
</div>
|
||||
|
||||
{{ if .LineStart }}
|
||||
<diff-link-json
|
||||
line-link="{{ .LineStart }}"
|
||||
line-display="{{ .LineStart }}{{ if .LineEnd }} - {{ .LineEnd }}{{ end }}"
|
||||
switch-view="ctrl.getDiff('json')"
|
||||
/>
|
||||
{{ end }}
|
||||
</span>
|
||||
</li>
|
||||
{{ end }}`
|
||||
|
||||
// tplSummary is for basic summaries
|
||||
tplSummary = `{{ define "summary" -}}
|
||||
<div class="diff-group-name">
|
||||
<i class="diff-circle diff-circle-{{ getChange .Change }} fa fa-circle-o diff-list-circle"></i>
|
||||
|
||||
{{ if .Count }}
|
||||
<strong>{{ .Count }}</strong>
|
||||
{{ end }}
|
||||
|
||||
{{ if .Key }}
|
||||
<strong class="diff-summary-key">{{ .Key }}</strong>
|
||||
{{ getChange .Change }}
|
||||
{{ end }}
|
||||
|
||||
{{ if .LineStart }}
|
||||
<diff-link-json
|
||||
line-link="{{ .LineStart }}"
|
||||
line-display="{{ .LineStart }}{{ if .LineEnd }} - {{ .LineEnd }}{{ end }}"
|
||||
switch-view="ctrl.getDiff('json')"
|
||||
/>
|
||||
{{ end }}
|
||||
</div>
|
||||
{{ end }}`
|
||||
)
|
477 pkg/components/dashdiffs/formatter_json.go (new file)
@ -0,0 +1,477 @@
|
||||
package dashdiffs
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"errors"
|
||||
"fmt"
|
||||
"html/template"
|
||||
"sort"
|
||||
|
||||
diff "github.com/yudai/gojsondiff"
|
||||
)
|
||||
|
||||
type ChangeType int
|
||||
|
||||
const (
|
||||
ChangeNil ChangeType = iota
|
||||
ChangeAdded
|
||||
ChangeDeleted
|
||||
ChangeOld
|
||||
ChangeNew
|
||||
ChangeUnchanged
|
||||
)
|
||||
|
||||
var (
|
||||
// changeTypeToSymbol is used for populating the terminating character in
|
||||
// the diff
|
||||
changeTypeToSymbol = map[ChangeType]string{
|
||||
ChangeNil: "",
|
||||
ChangeAdded: "+",
|
||||
ChangeDeleted: "-",
|
||||
ChangeOld: "-",
|
||||
ChangeNew: "+",
|
||||
}
|
||||
|
||||
// changeTypeToName is used for populating class names in the diff
|
||||
changeTypeToName = map[ChangeType]string{
|
||||
ChangeNil: "same",
|
||||
ChangeAdded: "added",
|
||||
ChangeDeleted: "deleted",
|
||||
ChangeOld: "old",
|
||||
ChangeNew: "new",
|
||||
}
|
||||
)
|
||||
|
||||
var (
|
||||
// tplJSONDiffWrapper is the template that wraps a diff
|
||||
tplJSONDiffWrapper = `{{ define "JSONDiffWrapper" -}}
|
||||
{{ range $index, $element := . }}
|
||||
{{ template "JSONDiffLine" $element }}
|
||||
{{ end }}
|
||||
{{ end }}`
|
||||
|
||||
// tplJSONDiffLine is the template that prints each line in a diff
|
||||
tplJSONDiffLine = `{{ define "JSONDiffLine" -}}
|
||||
<p id="l{{ .LineNum }}" class="diff-line diff-json-{{ cton .Change }}">
|
||||
<span class="diff-line-number">
|
||||
{{if .LeftLine }}{{ .LeftLine }}{{ end }}
|
||||
</span>
|
||||
<span class="diff-line-number">
|
||||
{{if .RightLine }}{{ .RightLine }}{{ end }}
|
||||
</span>
|
||||
<span class="diff-value diff-indent-{{ .Indent }}" title="{{ .Text }}">
|
||||
{{ .Text }}
|
||||
</span>
|
||||
<span class="diff-line-icon">{{ ctos .Change }}</span>
|
||||
</p>
|
||||
{{ end }}`
|
||||
)
|
||||
|
||||
var diffTplFuncs = template.FuncMap{
|
||||
"ctos": func(c ChangeType) string {
|
||||
if symbol, ok := changeTypeToSymbol[c]; ok {
|
||||
return symbol
|
||||
}
|
||||
return ""
|
||||
},
|
||||
"cton": func(c ChangeType) string {
|
||||
if name, ok := changeTypeToName[c]; ok {
|
||||
return name
|
||||
}
|
||||
return ""
|
||||
},
|
||||
}
|
||||
|
||||
// JSONLine contains the data required to render each line of the JSON diff
|
||||
// and contains the data required to produce the tokens output in the basic
|
||||
// diff.
|
||||
type JSONLine struct {
|
||||
LineNum int `json:"line"`
|
||||
LeftLine int `json:"leftLine"`
|
||||
RightLine int `json:"rightLine"`
|
||||
Indent int `json:"indent"`
|
||||
Text string `json:"text"`
|
||||
Change ChangeType `json:"changeType"`
|
||||
Key string `json:"key"`
|
||||
Val interface{} `json:"value"`
|
||||
}
|
||||
|
||||
func NewJSONFormatter(left interface{}) *JSONFormatter {
|
||||
tpl := template.Must(template.New("JSONDiffWrapper").Funcs(diffTplFuncs).Parse(tplJSONDiffWrapper))
|
||||
tpl = template.Must(tpl.New("JSONDiffLine").Funcs(diffTplFuncs).Parse(tplJSONDiffLine))
|
||||
|
||||
return &JSONFormatter{
|
||||
left: left,
|
||||
Lines: []*JSONLine{},
|
||||
tpl: tpl,
|
||||
path: []string{},
|
||||
size: []int{},
|
||||
lineCount: 0,
|
||||
inArray: []bool{},
|
||||
}
|
||||
}
|
||||
|
||||
type JSONFormatter struct {
|
||||
left interface{}
|
||||
path []string
|
||||
size []int
|
||||
inArray []bool
|
||||
lineCount int
|
||||
leftLine int
|
||||
rightLine int
|
||||
line *AsciiLine
|
||||
Lines []*JSONLine
|
||||
tpl *template.Template
|
||||
}
|
||||
|
||||
type AsciiLine struct {
|
||||
// the type of change
|
||||
change ChangeType
|
||||
|
||||
// the actual changes - no formatting
|
||||
key string
|
||||
val interface{}
|
||||
|
||||
// level of indentation for the current line
|
||||
indent int
|
||||
|
||||
// buffer containing the fully formatted line
|
||||
buffer *bytes.Buffer
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) Format(diff diff.Diff) (result string, err error) {
|
||||
if v, ok := f.left.(map[string]interface{}); ok {
|
||||
f.formatObject(v, diff)
|
||||
} else if v, ok := f.left.([]interface{}); ok {
|
||||
f.formatArray(v, diff)
|
||||
} else {
|
||||
return "", fmt.Errorf("expected map[string]interface{} or []interface{}, got %T",
|
||||
f.left)
|
||||
}
|
||||
|
||||
b := &bytes.Buffer{}
|
||||
err = f.tpl.ExecuteTemplate(b, "JSONDiffWrapper", f.Lines)
|
||||
if err != nil {
|
||||
fmt.Printf("%v\n", err)
|
||||
return "", err
|
||||
}
|
||||
return b.String(), nil
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) formatObject(left map[string]interface{}, df diff.Diff) {
|
||||
f.addLineWith(ChangeNil, "{")
|
||||
f.push("ROOT", len(left), false)
|
||||
f.processObject(left, df.Deltas())
|
||||
f.pop()
|
||||
f.addLineWith(ChangeNil, "}")
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) formatArray(left []interface{}, df diff.Diff) {
|
||||
f.addLineWith(ChangeNil, "[")
|
||||
f.push("ROOT", len(left), true)
|
||||
f.processArray(left, df.Deltas())
|
||||
f.pop()
|
||||
f.addLineWith(ChangeNil, "]")
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) processArray(array []interface{}, deltas []diff.Delta) error {
|
||||
patchedIndex := 0
|
||||
for index, value := range array {
|
||||
f.processItem(value, deltas, diff.Index(index))
|
||||
patchedIndex++
|
||||
}
|
||||
|
||||
// additional Added
|
||||
for _, delta := range deltas {
|
||||
switch delta.(type) {
|
||||
case *diff.Added:
|
||||
d := delta.(*diff.Added)
|
||||
// skip items already processed
|
||||
if int(d.Position.(diff.Index)) < len(array) {
|
||||
continue
|
||||
}
|
||||
f.printRecursive(d.Position.String(), d.Value, ChangeAdded)
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) processObject(object map[string]interface{}, deltas []diff.Delta) error {
|
||||
names := sortKeys(object)
|
||||
for _, name := range names {
|
||||
value := object[name]
|
||||
f.processItem(value, deltas, diff.Name(name))
|
||||
}
|
||||
|
||||
// Added
|
||||
for _, delta := range deltas {
|
||||
switch delta.(type) {
|
||||
case *diff.Added:
|
||||
d := delta.(*diff.Added)
|
||||
f.printRecursive(d.Position.String(), d.Value, ChangeAdded)
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) processItem(value interface{}, deltas []diff.Delta, position diff.Position) error {
|
||||
matchedDeltas := f.searchDeltas(deltas, position)
|
||||
positionStr := position.String()
|
||||
if len(matchedDeltas) > 0 {
|
||||
for _, matchedDelta := range matchedDeltas {
|
||||
|
||||
switch matchedDelta.(type) {
|
||||
case *diff.Object:
|
||||
d := matchedDelta.(*diff.Object)
|
||||
switch value.(type) {
|
||||
case map[string]interface{}:
|
||||
//ok
|
||||
default:
|
||||
return errors.New("Type mismatch")
|
||||
}
|
||||
o := value.(map[string]interface{})
|
||||
|
||||
f.newLine(ChangeNil)
|
||||
f.printKey(positionStr)
|
||||
f.print("{")
|
||||
f.closeLine()
|
||||
f.push(positionStr, len(o), false)
|
||||
f.processObject(o, d.Deltas)
|
||||
f.pop()
|
||||
f.newLine(ChangeNil)
|
||||
f.print("}")
|
||||
f.printComma()
|
||||
f.closeLine()
|
||||
|
||||
case *diff.Array:
|
||||
d := matchedDelta.(*diff.Array)
|
||||
switch value.(type) {
|
||||
case []interface{}:
|
||||
//ok
|
||||
default:
|
||||
return errors.New("Type mismatch")
|
||||
}
|
||||
a := value.([]interface{})
|
||||
|
||||
f.newLine(ChangeNil)
|
||||
f.printKey(positionStr)
|
||||
f.print("[")
|
||||
f.closeLine()
|
||||
f.push(positionStr, len(a), true)
|
||||
f.processArray(a, d.Deltas)
|
||||
f.pop()
|
||||
f.newLine(ChangeNil)
|
||||
f.print("]")
|
||||
f.printComma()
|
||||
f.closeLine()
|
||||
|
||||
case *diff.Added:
|
||||
d := matchedDelta.(*diff.Added)
|
||||
f.printRecursive(positionStr, d.Value, ChangeAdded)
|
||||
f.size[len(f.size)-1]++
|
||||
|
||||
case *diff.Modified:
|
||||
d := matchedDelta.(*diff.Modified)
|
||||
savedSize := f.size[len(f.size)-1]
|
||||
f.printRecursive(positionStr, d.OldValue, ChangeOld)
|
||||
f.size[len(f.size)-1] = savedSize
|
||||
f.printRecursive(positionStr, d.NewValue, ChangeNew)
|
||||
|
||||
case *diff.TextDiff:
|
||||
savedSize := f.size[len(f.size)-1]
|
||||
d := matchedDelta.(*diff.TextDiff)
|
||||
f.printRecursive(positionStr, d.OldValue, ChangeOld)
|
||||
f.size[len(f.size)-1] = savedSize
|
||||
f.printRecursive(positionStr, d.NewValue, ChangeNew)
|
||||
|
||||
case *diff.Deleted:
|
||||
d := matchedDelta.(*diff.Deleted)
|
||||
f.printRecursive(positionStr, d.Value, ChangeDeleted)
|
||||
|
||||
default:
|
||||
return errors.New("Unknown Delta type detected")
|
||||
}
|
||||
|
||||
}
|
||||
} else {
|
||||
f.printRecursive(positionStr, value, ChangeUnchanged)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) searchDeltas(deltas []diff.Delta, position diff.Position) (results []diff.Delta) {
|
||||
results = make([]diff.Delta, 0)
|
||||
for _, delta := range deltas {
|
||||
switch delta.(type) {
|
||||
case diff.PostDelta:
|
||||
if delta.(diff.PostDelta).PostPosition() == position {
|
||||
results = append(results, delta)
|
||||
}
|
||||
case diff.PreDelta:
|
||||
if delta.(diff.PreDelta).PrePosition() == position {
|
||||
results = append(results, delta)
|
||||
}
|
||||
default:
|
||||
panic("heh")
|
||||
}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) push(name string, size int, array bool) {
|
||||
f.path = append(f.path, name)
|
||||
f.size = append(f.size, size)
|
||||
f.inArray = append(f.inArray, array)
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) pop() {
|
||||
f.path = f.path[0 : len(f.path)-1]
|
||||
f.size = f.size[0 : len(f.size)-1]
|
||||
f.inArray = f.inArray[0 : len(f.inArray)-1]
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) addLineWith(change ChangeType, value string) {
|
||||
f.line = &AsciiLine{
|
||||
change: change,
|
||||
indent: len(f.path),
|
||||
buffer: bytes.NewBufferString(value),
|
||||
}
|
||||
f.closeLine()
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) newLine(change ChangeType) {
|
||||
f.line = &AsciiLine{
|
||||
change: change,
|
||||
indent: len(f.path),
|
||||
buffer: bytes.NewBuffer([]byte{}),
|
||||
}
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) closeLine() {
|
||||
leftLine := 0
|
||||
rightLine := 0
|
||||
f.lineCount++
|
||||
|
||||
switch f.line.change {
|
||||
case ChangeAdded, ChangeNew:
|
||||
f.rightLine++
|
||||
rightLine = f.rightLine
|
||||
|
||||
case ChangeDeleted, ChangeOld:
|
||||
f.leftLine++
|
||||
leftLine = f.leftLine
|
||||
|
||||
case ChangeNil, ChangeUnchanged:
|
||||
f.rightLine++
|
||||
f.leftLine++
|
||||
rightLine = f.rightLine
|
||||
leftLine = f.leftLine
|
||||
}
|
||||
|
||||
s := f.line.buffer.String()
|
||||
f.Lines = append(f.Lines, &JSONLine{
|
||||
LineNum: f.lineCount,
|
||||
RightLine: rightLine,
|
||||
LeftLine: leftLine,
|
||||
Indent: f.line.indent,
|
||||
Text: s,
|
||||
Change: f.line.change,
|
||||
Key: f.line.key,
|
||||
Val: f.line.val,
|
||||
})
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) printKey(name string) {
|
||||
if !f.inArray[len(f.inArray)-1] {
|
||||
f.line.key = name
|
||||
fmt.Fprintf(f.line.buffer, `"%s": `, name)
|
||||
}
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) printComma() {
|
||||
f.size[len(f.size)-1]--
|
||||
if f.size[len(f.size)-1] > 0 {
|
||||
f.line.buffer.WriteRune(',')
|
||||
}
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) printValue(value interface{}) {
|
||||
switch value.(type) {
|
||||
case string:
|
||||
f.line.val = value
|
||||
fmt.Fprintf(f.line.buffer, `"%s"`, value)
|
||||
case nil:
|
||||
f.line.val = "null"
|
||||
f.line.buffer.WriteString("null")
|
||||
default:
|
||||
f.line.val = value
|
||||
fmt.Fprintf(f.line.buffer, `%#v`, value)
|
||||
}
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) print(a string) {
|
||||
f.line.buffer.WriteString(a)
|
||||
}
|
||||
|
||||
func (f *JSONFormatter) printRecursive(name string, value interface{}, change ChangeType) {
|
||||
switch value.(type) {
|
||||
case map[string]interface{}:
|
||||
f.newLine(change)
|
||||
f.printKey(name)
|
||||
f.print("{")
|
||||
f.closeLine()
|
||||
|
||||
m := value.(map[string]interface{})
|
||||
size := len(m)
|
||||
f.push(name, size, false)
|
||||
|
||||
keys := sortKeys(m)
|
||||
for _, key := range keys {
|
||||
f.printRecursive(key, m[key], change)
|
||||
}
|
||||
f.pop()
|
||||
|
||||
f.newLine(change)
|
||||
f.print("}")
|
||||
f.printComma()
|
||||
f.closeLine()
|
||||
|
||||
case []interface{}:
|
||||
f.newLine(change)
|
||||
f.printKey(name)
|
||||
f.print("[")
|
||||
f.closeLine()
|
||||
|
||||
s := value.([]interface{})
|
||||
size := len(s)
|
||||
f.push("", size, true)
|
||||
for _, item := range s {
|
||||
f.printRecursive("", item, change)
|
||||
}
|
||||
f.pop()
|
||||
|
||||
f.newLine(change)
|
||||
f.print("]")
|
||||
f.printComma()
|
||||
f.closeLine()
|
||||
|
||||
default:
|
||||
f.newLine(change)
|
||||
f.printKey(name)
|
||||
f.printValue(value)
|
||||
f.printComma()
|
||||
f.closeLine()
|
||||
}
|
||||
}
|
||||
|
||||
func sortKeys(m map[string]interface{}) (keys []string) {
|
||||
keys = make([]string, 0, len(m))
|
||||
for key := range m {
|
||||
keys = append(keys, key)
|
||||
}
|
||||
sort.Strings(keys)
|
||||
return
|
||||
}
|
@ -78,5 +78,9 @@ func (u *S3Uploader) Upload(imageDiskPath string) (string, error) {
|
||||
return "", err
|
||||
}
|
||||
|
||||
return "https://" + u.bucket + ".s3-" + u.region + ".amazonaws.com/" + key, nil
|
||||
if u.region == "us-east-1" {
|
||||
return "https://" + u.bucket + ".s3.amazonaws.com/" + key, nil
|
||||
} else {
|
||||
return "https://" + u.bucket + ".s3-" + u.region + ".amazonaws.com/" + key, nil
|
||||
}
|
||||
}
|
||||
|
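The fix above special-cases `us-east-1` (US Standard), whose S3 endpoint has no region suffix in the hostname. A tiny sketch of the resulting URL construction, with a made-up bucket and key:

```go
package main

import "fmt"

// publicImageURL mirrors the region handling above: us-east-1 uses the plain
// s3.amazonaws.com hostname, every other region gets a region suffix.
func publicImageURL(bucket, region, key string) string {
	if region == "us-east-1" {
		return "https://" + bucket + ".s3.amazonaws.com/" + key
	}
	return "https://" + bucket + ".s3-" + region + ".amazonaws.com/" + key
}

func main() {
	fmt.Println(publicImageURL("grafana-images", "us-east-1", "alert.png"))
	fmt.Println(publicImageURL("grafana-images", "eu-west-1", "alert.png"))
}
```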
@ -21,6 +21,7 @@ type WebdavUploader struct {
|
||||
}
|
||||
|
||||
var netTransport = &http.Transport{
|
||||
Proxy: http.ProxyFromEnvironment,
|
||||
Dial: (&net.Dialer{
|
||||
Timeout: 60 * time.Second,
|
||||
}).Dial,
|
||||
|
@ -124,7 +124,7 @@ func (m *StandardMeter) Count() int64 {
|
||||
return count
|
||||
}
|
||||
|
||||
// Mark records the occurance of n events.
|
||||
// Mark records the occurrence of n events.
|
||||
func (m *StandardMeter) Mark(n int64) {
|
||||
m.lock.Lock()
|
||||
defer m.lock.Unlock()
|
||||
|
@ -49,9 +49,9 @@ func Logger() macaron.Handler {
|
||||
if ctx, ok := c.Data["ctx"]; ok {
|
||||
ctxTyped := ctx.(*Context)
|
||||
if status == 500 {
|
||||
ctxTyped.Logger.Error("Request Completed", "method", req.Method, "path", req.URL.Path, "status", status, "remote_addr", c.RemoteAddr(), "time_ms", int64(timeTakenMs), "size", rw.Size())
|
||||
ctxTyped.Logger.Error("Request Completed", "method", req.Method, "path", req.URL.Path, "status", status, "remote_addr", c.RemoteAddr(), "time_ms", int64(timeTakenMs), "size", rw.Size(), "referer", req.Referer())
|
||||
} else {
|
||||
ctxTyped.Logger.Info("Request Completed", "method", req.Method, "path", req.URL.Path, "status", status, "remote_addr", c.RemoteAddr(), "time_ms", int64(timeTakenMs), "size", rw.Size())
|
||||
ctxTyped.Logger.Info("Request Completed", "method", req.Method, "path", req.URL.Path, "status", status, "remote_addr", c.RemoteAddr(), "time_ms", int64(timeTakenMs), "size", rw.Size(), "referer", req.Referer())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
71 pkg/models/dashboard_version.go (new file)
@ -0,0 +1,71 @@
|
||||
package models
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"time"
|
||||
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
)
|
||||
|
||||
var (
|
||||
ErrDashboardVersionNotFound = errors.New("Dashboard version not found")
|
||||
ErrNoVersionsForDashboardId = errors.New("No dashboard versions found for the given DashboardId")
|
||||
)
|
||||
|
||||
// A DashboardVersion represents the comparable data in a dashboard, allowing
|
||||
// diffs of the dashboard to be performed.
|
||||
type DashboardVersion struct {
|
||||
Id int64 `json:"id"`
|
||||
DashboardId int64 `json:"dashboardId"`
|
||||
ParentVersion int `json:"parentVersion"`
|
||||
RestoredFrom int `json:"restoredFrom"`
|
||||
Version int `json:"version"`
|
||||
|
||||
Created time.Time `json:"created"`
|
||||
CreatedBy int64 `json:"createdBy"`
|
||||
|
||||
Message string `json:"message"`
|
||||
Data *simplejson.Json `json:"data"`
|
||||
}
|
||||
|
||||
// DashboardVersionMeta extends the dashboard version model with the names
|
||||
// associated with the UserIds, overriding the field with the same name from
|
||||
// the DashboardVersion model.
|
||||
type DashboardVersionMeta struct {
|
||||
DashboardVersion
|
||||
CreatedBy string `json:"createdBy"`
|
||||
}
|
||||
|
||||
// DashboardVersionDTO represents a dashboard version, without the dashboard
|
||||
// map.
|
||||
type DashboardVersionDTO struct {
|
||||
Id int64 `json:"id"`
|
||||
DashboardId int64 `json:"dashboardId"`
|
||||
ParentVersion int `json:"parentVersion"`
|
||||
RestoredFrom int `json:"restoredFrom"`
|
||||
Version int `json:"version"`
|
||||
Created time.Time `json:"created"`
|
||||
CreatedBy string `json:"createdBy"`
|
||||
Message string `json:"message"`
|
||||
}
|
||||
|
||||
//
|
||||
// Queries
|
||||
//
|
||||
|
||||
type GetDashboardVersionQuery struct {
|
||||
DashboardId int64
|
||||
OrgId int64
|
||||
Version int
|
||||
|
||||
Result *DashboardVersion
|
||||
}
|
||||
|
||||
type GetDashboardVersionsQuery struct {
|
||||
DashboardId int64
|
||||
OrgId int64
|
||||
Limit int
|
||||
Start int
|
||||
|
||||
Result []*DashboardVersionDTO
|
||||
}
|
@ -98,12 +98,17 @@ func NewDashboardFromJson(data *simplejson.Json) *Dashboard {
|
||||
// GetDashboardModel turns the command into the savable model
|
||||
func (cmd *SaveDashboardCommand) GetDashboardModel() *Dashboard {
|
||||
dash := NewDashboardFromJson(cmd.Dashboard)
|
||||
userId := cmd.UserId
|
||||
|
||||
if dash.Data.Get("version").MustInt(0) == 0 {
|
||||
dash.CreatedBy = cmd.UserId
|
||||
if userId == 0 {
|
||||
userId = -1
|
||||
}
|
||||
|
||||
dash.UpdatedBy = cmd.UserId
|
||||
if dash.Data.Get("version").MustInt(0) == 0 {
|
||||
dash.CreatedBy = userId
|
||||
}
|
||||
|
||||
dash.UpdatedBy = userId
|
||||
dash.OrgId = cmd.OrgId
|
||||
dash.PluginId = cmd.PluginId
|
||||
dash.UpdateSlug()
|
||||
@ -126,11 +131,13 @@ func (dash *Dashboard) UpdateSlug() {
|
||||
//
|
||||
|
||||
type SaveDashboardCommand struct {
|
||||
Dashboard *simplejson.Json `json:"dashboard" binding:"Required"`
|
||||
UserId int64 `json:"userId"`
|
||||
OrgId int64 `json:"-"`
|
||||
Overwrite bool `json:"overwrite"`
|
||||
PluginId string `json:"-"`
|
||||
Dashboard *simplejson.Json `json:"dashboard" binding:"Required"`
|
||||
UserId int64 `json:"userId"`
|
||||
Overwrite bool `json:"overwrite"`
|
||||
Message string `json:"message"`
|
||||
OrgId int64 `json:"-"`
|
||||
RestoredFrom int `json:"-"`
|
||||
PluginId string `json:"-"`
|
||||
|
||||
Result *Dashboard
|
||||
}
|
||||
@ -145,7 +152,8 @@ type DeleteDashboardCommand struct {
|
||||
//
|
||||
|
||||
type GetDashboardQuery struct {
|
||||
Slug string
|
||||
Slug string // required if no Id is specified
|
||||
Id int64 // optional if slug is set
|
||||
OrgId int64
|
||||
|
||||
Result *Dashboard
|
||||
|
@ -7,5 +7,5 @@ const (
|
||||
GOOGLE
|
||||
TWITTER
|
||||
GENERIC
|
||||
GRAFANANET
|
||||
GRAFANA_COM
|
||||
)
|
||||
|
@ -89,7 +89,7 @@ func (e *DashAlertExtractor) GetAlerts() ([]*m.Alert, error) {
|
||||
continue
|
||||
}
|
||||
|
||||
// backward compatability check, can be removed later
|
||||
// backward compatibility check, can be removed later
|
||||
enabled, hasEnabled := jsonAlert.CheckGet("enabled")
|
||||
if hasEnabled && enabled.MustBool() == false {
|
||||
continue
|
||||
|
@ -69,6 +69,11 @@ func (this *EmailNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
return err
|
||||
}
|
||||
|
||||
error := ""
|
||||
if evalContext.Error != nil {
|
||||
error = evalContext.Error.Error()
|
||||
}
|
||||
|
||||
cmd := &m.SendEmailCommandSync{
|
||||
SendEmailCommand: m.SendEmailCommand{
|
||||
Subject: evalContext.GetNotificationTitle(),
|
||||
@ -78,6 +83,7 @@ func (this *EmailNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
"Name": evalContext.Rule.Name,
|
||||
"StateModel": evalContext.GetStateModel(),
|
||||
"Message": evalContext.Rule.Message,
|
||||
"Error": error,
|
||||
"RuleUrl": ruleUrl,
|
||||
"ImageLink": "",
|
||||
"EmbededImage": "",
|
||||
|
@ -1,14 +1,15 @@
|
||||
package notifiers
|
||||
|
||||
import (
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
"github.com/grafana/grafana/pkg/log"
|
||||
"github.com/grafana/grafana/pkg/metrics"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
"github.com/grafana/grafana/pkg/services/alerting"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func init() {
|
||||
@ -23,6 +24,14 @@ func init() {
|
||||
<span class="gf-form-label width-10">Url</span>
|
||||
<input type="text" required class="gf-form-input max-width-26" ng-model="ctrl.model.settings.url" placeholder="http://sensu-api.local:4567/results"></input>
|
||||
</div>
|
||||
<div class="gf-form">
|
||||
<span class="gf-form-label width-10">Source</span>
|
||||
<input type="text" class="gf-form-input max-width-14" ng-model="ctrl.model.settings.source" bs-tooltip="'If empty rule id will be used'" data-placement="right"></input>
|
||||
</div>
|
||||
<div class="gf-form">
|
||||
<span class="gf-form-label width-10">Handler</span>
|
||||
<input type="text" class="gf-form-input max-width-14" ng-model="ctrl.model.settings.handler" placeholder="default"></input>
|
||||
</div>
|
||||
<div class="gf-form">
|
||||
<span class="gf-form-label width-10">Username</span>
|
||||
<input type="text" class="gf-form-input max-width-14" ng-model="ctrl.model.settings.username"></input>
|
||||
@ -46,7 +55,9 @@ func NewSensuNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
|
||||
NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
|
||||
Url: url,
|
||||
User: model.Settings.Get("username").MustString(),
|
||||
Source: model.Settings.Get("source").MustString(),
|
||||
Password: model.Settings.Get("password").MustString(),
|
||||
Handler: model.Settings.Get("handler").MustString(),
|
||||
log: log.New("alerting.notifier.sensu"),
|
||||
}, nil
|
||||
}
|
||||
@ -54,8 +65,10 @@ func NewSensuNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
|
||||
type SensuNotifier struct {
|
||||
NotifierBase
|
||||
Url string
|
||||
Source string
|
||||
User string
|
||||
Password string
|
||||
Handler string
|
||||
log log.Logger
|
||||
}
|
||||
|
||||
@ -67,9 +80,13 @@ func (this *SensuNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
bodyJSON.Set("ruleId", evalContext.Rule.Id)
|
||||
// Sensu alerts cannot have spaces in them
|
||||
bodyJSON.Set("name", strings.Replace(evalContext.Rule.Name, " ", "_", -1))
|
||||
// Sensu alerts require a command
|
||||
// We set it to the grafana ruleID
|
||||
bodyJSON.Set("source", "grafana_rule_"+strconv.FormatInt(evalContext.Rule.Id, 10))
|
||||
// Sensu alerts require a source. We set it to the user-specified value (optional),
|
||||
// else we fall back to the grafana ruleID.
|
||||
if this.Source != "" {
|
||||
bodyJSON.Set("source", this.Source)
|
||||
} else {
|
||||
bodyJSON.Set("source", "grafana_rule_"+strconv.FormatInt(evalContext.Rule.Id, 10))
|
||||
}
|
||||
// Finally, sensu expects an output
|
||||
// We set it to a default output
|
||||
bodyJSON.Set("output", "Grafana Metric Condition Met")
|
||||
@ -83,6 +100,10 @@ func (this *SensuNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
bodyJSON.Set("status", 0)
|
||||
}
|
||||
|
||||
if this.Handler != "" {
|
||||
bodyJSON.Set("handler", this.Handler)
|
||||
}
|
||||
|
||||
ruleUrl, err := evalContext.GetRuleUrl()
|
||||
if err == nil {
|
||||
bodyJSON.Set("ruleUrl", ruleUrl)
|
||||
|
@ -29,7 +29,9 @@ func TestSensuNotifier(t *testing.T) {
|
||||
Convey("from settings", func() {
|
||||
json := `
|
||||
{
|
||||
"url": "http://sensu-api.example.com:4567/results"
|
||||
"url": "http://sensu-api.example.com:4567/results",
|
||||
"source": "grafana_instance_01",
|
||||
"handler": "myhandler"
|
||||
}`
|
||||
|
||||
settingsJSON, _ := simplejson.NewJson([]byte(json))
|
||||
@ -46,6 +48,8 @@ func TestSensuNotifier(t *testing.T) {
|
||||
So(sensuNotifier.Name, ShouldEqual, "sensu")
|
||||
So(sensuNotifier.Type, ShouldEqual, "sensu")
|
||||
So(sensuNotifier.Url, ShouldEqual, "http://sensu-api.example.com:4567/results")
|
||||
So(sensuNotifier.Source, ShouldEqual, "grafana_instance_01")
|
||||
So(sensuNotifier.Handler, ShouldEqual, "myhandler")
|
||||
})
|
||||
})
|
||||
})
|
||||
|
@ -1,10 +1,10 @@
|
||||
package notifiers
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"time"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
"github.com/grafana/grafana/pkg/log"
|
||||
"github.com/grafana/grafana/pkg/metrics"
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
@ -15,6 +15,8 @@ import (
|
||||
// AlertStateCritical - Victorops uses "CRITICAL" string to indicate "Alerting" state
|
||||
const AlertStateCritical = "CRITICAL"
|
||||
|
||||
const AlertStateRecovery = "RECOVERY"
|
||||
|
||||
func init() {
|
||||
alerting.RegisterNotifier(&alerting.NotifierPlugin{
|
||||
Type: "victorops",
|
||||
@ -27,6 +29,15 @@ func init() {
|
||||
<span class="gf-form-label width-6">Url</span>
|
||||
<input type="text" required class="gf-form-input max-width-30" ng-model="ctrl.model.settings.url" placeholder="VictorOps url"></input>
|
||||
</div>
|
||||
<div class="gf-form">
|
||||
<gf-form-switch
|
||||
class="gf-form"
|
||||
label="Auto resolve incidents"
|
||||
label-class="width-14"
|
||||
checked="ctrl.model.settings.autoResolve"
|
||||
tooltip="Resolve incidents in VictorOps once the alert goes back to ok.">
|
||||
</gf-form-switch>
|
||||
</div>
|
||||
`,
|
||||
})
|
||||
}
|
||||
@ -34,6 +45,7 @@ func init() {
|
||||
// NewVictoropsNotifier creates an instance of VictoropsNotifier that
|
||||
// handles posting notifications to Victorops REST API
|
||||
func NewVictoropsNotifier(model *models.AlertNotification) (alerting.Notifier, error) {
|
||||
autoResolve := model.Settings.Get("autoResolve").MustBool(true)
|
||||
url := model.Settings.Get("url").MustString()
|
||||
if url == "" {
|
||||
return nil, alerting.ValidationError{Reason: "Could not find victorops url property in settings"}
|
||||
@ -42,6 +54,7 @@ func NewVictoropsNotifier(model *models.AlertNotification) (alerting.Notifier, e
|
||||
return &VictoropsNotifier{
|
||||
NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
|
||||
URL: url,
|
||||
AutoResolve: autoResolve,
|
||||
log: log.New("alerting.notifier.victorops"),
|
||||
}, nil
|
||||
}
|
||||
@ -51,8 +64,9 @@ func NewVictoropsNotifier(model *models.AlertNotification) (alerting.Notifier, e
|
||||
// Victorops specifications (http://victorops.force.com/knowledgebase/articles/Integration/Alert-Ingestion-API-Documentation/)
|
||||
type VictoropsNotifier struct {
|
||||
NotifierBase
|
||||
URL string
|
||||
log log.Logger
|
||||
URL string
|
||||
AutoResolve bool
|
||||
log log.Logger
|
||||
}
|
||||
|
||||
// Notify sends notification to Victorops via POST to URL endpoint
|
||||
@ -66,6 +80,11 @@ func (this *VictoropsNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
return err
|
||||
}
|
||||
|
||||
if evalContext.Rule.State == models.AlertStateOK && !this.AutoResolve {
|
||||
this.log.Info("Not alerting VictorOps", "state", evalContext.Rule.State, "auto resolve", this.AutoResolve)
|
||||
return nil
|
||||
}
|
||||
|
||||
fields := make([]map[string]interface{}, 0)
|
||||
fieldLimitCount := 4
|
||||
for index, evt := range evalContext.EvalMatches {
|
||||
@ -92,20 +111,28 @@ func (this *VictoropsNotifier) Notify(evalContext *alerting.EvalContext) error {
|
||||
messageType = AlertStateCritical
|
||||
}
|
||||
|
||||
body := map[string]interface{}{
|
||||
"message_type": messageType,
|
||||
"entity_id": evalContext.Rule.Name,
|
||||
"timestamp": time.Now().Unix(),
|
||||
"state_start_time": evalContext.StartTime.Unix(),
|
||||
"state_message": evalContext.Rule.Message + "\n" + ruleUrl,
|
||||
"monitoring_tool": "Grafana v" + setting.BuildVersion,
|
||||
if evalContext.Rule.State == models.AlertStateOK {
|
||||
messageType = AlertStateRecovery
|
||||
}
|
||||
|
||||
data, _ := json.Marshal(&body)
|
||||
bodyJSON := simplejson.New()
|
||||
bodyJSON.Set("message_type", messageType)
|
||||
bodyJSON.Set("entity_id", evalContext.Rule.Name)
|
||||
bodyJSON.Set("timestamp", time.Now().Unix())
|
||||
bodyJSON.Set("state_start_time", evalContext.StartTime.Unix())
|
||||
bodyJSON.Set("state_message", evalContext.Rule.Message)
|
||||
bodyJSON.Set("monitoring_tool", "Grafana v"+setting.BuildVersion)
|
||||
bodyJSON.Set("alert_url", ruleUrl)
|
||||
|
||||
if evalContext.ImagePublicUrl != "" {
|
||||
bodyJSON.Set("image_url", evalContext.ImagePublicUrl)
|
||||
}
|
||||
|
||||
data, _ := bodyJSON.MarshalJSON()
|
||||
cmd := &models.SendWebhookSync{Url: this.URL, Body: string(data)}
|
||||
|
||||
if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
|
||||
this.log.Error("Failed to send victorops notification", "error", err, "webhook", this.Name)
|
||||
this.log.Error("Failed to send Victorops notification", "error", err, "webhook", this.Name)
|
||||
return err
|
||||
}
|
||||
|
||||
|
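The VictorOps changes above add a `RECOVERY` message type and an auto-resolve switch that suppresses the notification when an alert returns to OK. A small sketch of that state-to-message-type decision, using simplified string constants in place of the Grafana models:

```go
package main

import "fmt"

const (
	alertStateOK       = "ok"
	alertStateAlerting = "alerting"

	msgCritical = "CRITICAL"
	msgRecovery = "RECOVERY"
	msgWarning  = "WARNING"
)

// messageType mirrors the notifier logic above: alerting maps to CRITICAL,
// OK maps to RECOVERY, and with auto-resolve disabled an OK state produces
// no notification at all (returned here as ok=false).
func messageType(state string, autoResolve bool) (string, bool) {
	if state == alertStateOK && !autoResolve {
		return "", false
	}
	msg := msgWarning
	if state == alertStateAlerting {
		msg = msgCritical
	}
	if state == alertStateOK {
		msg = msgRecovery
	}
	return msg, true
}

func main() {
	fmt.Println(messageType(alertStateAlerting, true)) // CRITICAL true
	fmt.Println(messageType(alertStateOK, true))       // RECOVERY true
	fmt.Println(messageType(alertStateOK, false))      // suppressed (false)
}
```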
@ -31,17 +31,15 @@ func (handler *DefaultResultHandler) Handle(evalContext *EvalContext) error {
|
||||
executionError := ""
|
||||
annotationData := simplejson.New()
|
||||
|
||||
if evalContext.Firing {
|
||||
annotationData = simplejson.NewFromAny(evalContext.EvalMatches)
|
||||
if len(evalContext.EvalMatches) > 0 {
|
||||
annotationData.Set("evalMatches", simplejson.NewFromAny(evalContext.EvalMatches))
|
||||
}
|
||||
|
||||
if evalContext.Error != nil {
|
||||
executionError = evalContext.Error.Error()
|
||||
annotationData.Set("errorMessage", executionError)
|
||||
}
|
||||
|
||||
if evalContext.NoDataFound {
|
||||
annotationData.Set("no_data", true)
|
||||
annotationData.Set("error", executionError)
|
||||
} else if evalContext.NoDataFound {
|
||||
annotationData.Set("noData", true)
|
||||
}
|
||||
|
||||
countStateResult(evalContext.Rule.State)
|
||||
|
@ -2,6 +2,7 @@ package alerting
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/null"
|
||||
@ -56,7 +57,7 @@ func createTestEvalContext(cmd *NotificationTestCommand) *EvalContext {
|
||||
}
|
||||
ctx.IsTestRun = true
|
||||
ctx.Firing = true
|
||||
ctx.Error = nil
|
||||
ctx.Error = fmt.Errorf("This is only a test")
|
||||
ctx.EvalMatches = evalMatchesBasedOnState()
|
||||
|
||||
return ctx
|
||||
|
@ -6,7 +6,6 @@ import (
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -48,7 +47,7 @@ func GetAllAlertQueryHandler(query *m.GetAllAlertsQuery) error {
|
||||
return nil
|
||||
}
|
||||
|
||||
func deleteAlertByIdInternal(alertId int64, reason string, sess *xorm.Session) error {
|
||||
func deleteAlertByIdInternal(alertId int64, reason string, sess *DBSession) error {
|
||||
sqlog.Debug("Deleting alert", "id", alertId, "reason", reason)
|
||||
|
||||
if _, err := sess.Exec("DELETE FROM alert WHERE id = ?", alertId); err != nil {
|
||||
@ -63,7 +62,7 @@ func deleteAlertByIdInternal(alertId int64, reason string, sess *xorm.Session) e
|
||||
}
|
||||
|
||||
func DeleteAlertById(cmd *m.DeleteAlertCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
return deleteAlertByIdInternal(cmd.AlertId, "DeleteAlertCommand", sess)
|
||||
})
|
||||
}
|
||||
@ -123,7 +122,7 @@ func HandleAlertsQuery(query *m.GetAlertsQuery) error {
|
||||
return nil
|
||||
}
|
||||
|
||||
func DeleteAlertDefinition(dashboardId int64, sess *xorm.Session) error {
|
||||
func DeleteAlertDefinition(dashboardId int64, sess *DBSession) error {
|
||||
alerts := make([]*m.Alert, 0)
|
||||
sess.Where("dashboard_id = ?", dashboardId).Find(&alerts)
|
||||
|
||||
@ -135,7 +134,7 @@ func DeleteAlertDefinition(dashboardId int64, sess *xorm.Session) error {
|
||||
}
|
||||
|
||||
func SaveAlerts(cmd *m.SaveAlertsCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
existingAlerts, err := GetAlertsByDashboardId2(cmd.DashboardId, sess)
|
||||
if err != nil {
|
||||
return err
|
||||
@ -153,7 +152,7 @@ func SaveAlerts(cmd *m.SaveAlertsCommand) error {
|
||||
})
|
||||
}
|
||||
|
||||
func upsertAlerts(existingAlerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *xorm.Session) error {
|
||||
func upsertAlerts(existingAlerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *DBSession) error {
|
||||
for _, alert := range cmd.Alerts {
|
||||
update := false
|
||||
var alertToUpdate *m.Alert
|
||||
@ -197,7 +196,7 @@ func upsertAlerts(existingAlerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *xor
|
||||
return nil
|
||||
}
|
||||
|
||||
func deleteMissingAlerts(alerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *xorm.Session) error {
|
||||
func deleteMissingAlerts(alerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *DBSession) error {
|
||||
for _, missingAlert := range alerts {
|
||||
missing := true
|
||||
|
||||
@ -216,7 +215,7 @@ func deleteMissingAlerts(alerts []*m.Alert, cmd *m.SaveAlertsCommand, sess *xorm
|
||||
return nil
|
||||
}
|
||||
|
||||
func GetAlertsByDashboardId2(dashboardId int64, sess *xorm.Session) ([]*m.Alert, error) {
|
||||
func GetAlertsByDashboardId2(dashboardId int64, sess *DBSession) ([]*m.Alert, error) {
|
||||
alerts := make([]*m.Alert, 0)
|
||||
err := sess.Where("dashboard_id = ?", dashboardId).Find(&alerts)
|
||||
|
||||
@ -228,7 +227,7 @@ func GetAlertsByDashboardId2(dashboardId int64, sess *xorm.Session) ([]*m.Alert,
|
||||
}
|
||||
|
||||
func SetAlertState(cmd *m.SetAlertStateCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
alert := m.Alert{}
|
||||
|
||||
if has, err := sess.Id(cmd.AlertId).Get(&alert); err != nil {
|
||||
@ -262,7 +261,7 @@ func SetAlertState(cmd *m.SetAlertStateCommand) error {
|
||||
}
|
||||
|
||||
func PauseAlert(cmd *m.PauseAlertCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
if len(cmd.AlertIds) == 0 {
|
||||
return fmt.Errorf("command contains no alertids")
|
||||
}
|
||||
@ -292,7 +291,7 @@ func PauseAlert(cmd *m.PauseAlertCommand) error {
|
||||
}
|
||||
|
||||
func PauseAllAlerts(cmd *m.PauseAllAlertCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var newState string
|
||||
if cmd.Paused {
|
||||
newState = string(m.AlertStatePaused)
|
||||
|
@ -6,7 +6,6 @@ import (
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -21,7 +20,7 @@ func init() {
|
||||
}
|
||||
|
||||
func DeleteAlertNotification(cmd *m.DeleteAlertNotificationCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
sql := "DELETE FROM alert_notification WHERE alert_notification.org_id = ? AND alert_notification.id = ?"
|
||||
_, err := sess.Exec(sql, cmd.OrgId, cmd.Id)
|
||||
|
||||
@ -34,7 +33,7 @@ func DeleteAlertNotification(cmd *m.DeleteAlertNotificationCommand) error {
|
||||
}
|
||||
|
||||
func GetAlertNotifications(query *m.GetAlertNotificationsQuery) error {
|
||||
return getAlertNotificationInternal(query, x.NewSession())
|
||||
return getAlertNotificationInternal(query, newSession())
|
||||
}
|
||||
|
||||
func GetAllAlertNotifications(query *m.GetAllAlertNotificationsQuery) error {
|
||||
@ -85,7 +84,7 @@ func GetAlertNotificationsToSend(query *m.GetAlertNotificationsToSendQuery) erro
|
||||
return nil
|
||||
}
|
||||
|
||||
func getAlertNotificationInternal(query *m.GetAlertNotificationsQuery, sess *xorm.Session) error {
|
||||
func getAlertNotificationInternal(query *m.GetAlertNotificationsQuery, sess *DBSession) error {
|
||||
var sql bytes.Buffer
|
||||
params := make([]interface{}, 0)
|
||||
|
||||
@ -131,7 +130,7 @@ func getAlertNotificationInternal(query *m.GetAlertNotificationsQuery, sess *xor
|
||||
}
|
||||
|
||||
func CreateAlertNotificationCommand(cmd *m.CreateAlertNotificationCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
existingQuery := &m.GetAlertNotificationsQuery{OrgId: cmd.OrgId, Name: cmd.Name}
|
||||
err := getAlertNotificationInternal(existingQuery, sess)
|
||||
|
||||
@ -163,7 +162,7 @@ func CreateAlertNotificationCommand(cmd *m.CreateAlertNotificationCommand) error
|
||||
}
|
||||
|
||||
func UpdateAlertNotification(cmd *m.UpdateAlertNotificationCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) (err error) {
|
||||
return inTransaction(func(sess *DBSession) (err error) {
|
||||
current := m.AlertNotification{}
|
||||
|
||||
if _, err = sess.Id(cmd.Id).Get(&current); err != nil {
|
||||
|
@ -5,7 +5,6 @@ import (
|
||||
"fmt"
|
||||
"strings"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/services/annotations"
|
||||
)
|
||||
|
||||
@ -13,7 +12,7 @@ type SqlAnnotationRepo struct {
|
||||
}
|
||||
|
||||
func (r *SqlAnnotationRepo) Save(item *annotations.Item) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
if _, err := sess.Table("annotation").Insert(item); err != nil {
|
||||
return err
|
||||
@ -24,7 +23,7 @@ func (r *SqlAnnotationRepo) Save(item *annotations.Item) error {
|
||||
}
|
||||
|
||||
func (r *SqlAnnotationRepo) Update(item *annotations.Item) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
if _, err := sess.Table("annotation").Id(item.Id).Update(item); err != nil {
|
||||
return err
|
||||
@ -97,7 +96,7 @@ func (r *SqlAnnotationRepo) Find(query *annotations.ItemQuery) ([]*annotations.I
|
||||
}
|
||||
|
||||
func (r *SqlAnnotationRepo) Delete(params *annotations.DeleteParams) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
sql := "DELETE FROM annotation WHERE dashboard_id = ? AND panel_id = ?"
|
||||
|
||||
|
@ -3,7 +3,6 @@ package sqlstore
|
||||
import (
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -24,7 +23,7 @@ func GetApiKeys(query *m.GetApiKeysQuery) error {
|
||||
}
|
||||
|
||||
func DeleteApiKey(cmd *m.DeleteApiKeyCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM api_key WHERE id=? and org_id=?"
|
||||
_, err := sess.Exec(rawSql, cmd.Id, cmd.OrgId)
|
||||
return err
|
||||
@ -32,7 +31,7 @@ func DeleteApiKey(cmd *m.DeleteApiKeyCommand) error {
|
||||
}
|
||||
|
||||
func AddApiKey(cmd *m.AddApiKeyCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
t := m.ApiKey{
|
||||
OrgId: cmd.OrgId,
|
||||
Name: cmd.Name,
|
||||
|
@ -3,8 +3,8 @@ package sqlstore
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/metrics"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
@ -23,7 +23,7 @@ func init() {
|
||||
}
|
||||
|
||||
func SaveDashboard(cmd *m.SaveDashboardCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
dash := cmd.GetDashboardModel()
|
||||
|
||||
// try get existing dashboard
|
||||
@ -63,16 +63,20 @@ func SaveDashboard(cmd *m.SaveDashboardCommand) error {
|
||||
if dash.Id != sameTitle.Id {
|
||||
if cmd.Overwrite {
|
||||
dash.Id = sameTitle.Id
|
||||
dash.Version = sameTitle.Version
|
||||
} else {
|
||||
return m.ErrDashboardWithSameNameExists
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
parentVersion := dash.Version
|
||||
affectedRows := int64(0)
|
||||
|
||||
if dash.Id == 0 {
|
||||
dash.Version = 1
|
||||
metrics.M_Models_Dashboard_Insert.Inc(1)
|
||||
dash.Data.Set("version", dash.Version)
|
||||
affectedRows, err = sess.Insert(dash)
|
||||
} else {
|
||||
dash.Version += 1
|
||||
@ -80,10 +84,32 @@ func SaveDashboard(cmd *m.SaveDashboardCommand) error {
|
||||
affectedRows, err = sess.Id(dash.Id).Update(dash)
|
||||
}
|
||||
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
if affectedRows == 0 {
|
||||
return m.ErrDashboardNotFound
|
||||
}
|
||||
|
||||
dashVersion := &m.DashboardVersion{
|
||||
DashboardId: dash.Id,
|
||||
ParentVersion: parentVersion,
|
||||
RestoredFrom: cmd.RestoredFrom,
|
||||
Version: dash.Version,
|
||||
Created: time.Now(),
|
||||
CreatedBy: dash.UpdatedBy,
|
||||
Message: cmd.Message,
|
||||
Data: dash.Data,
|
||||
}
|
||||
|
||||
// insert version entry
|
||||
if affectedRows, err = sess.Insert(dashVersion); err != nil {
|
||||
return err
|
||||
} else if affectedRows == 0 {
|
||||
return m.ErrDashboardNotFound
|
||||
}
|
||||
|
||||
// delete existing tabs
|
||||
_, err = sess.Exec("DELETE FROM dashboard_tag WHERE dashboard_id=?", dash.Id)
|
||||
if err != nil {
|
||||
@ -107,8 +133,9 @@ func SaveDashboard(cmd *m.SaveDashboardCommand) error {
|
||||
}
|
||||
|
||||
func GetDashboard(query *m.GetDashboardQuery) error {
|
||||
dashboard := m.Dashboard{Slug: query.Slug, OrgId: query.OrgId}
|
||||
dashboard := m.Dashboard{Slug: query.Slug, OrgId: query.OrgId, Id: query.Id}
|
||||
has, err := x.Get(&dashboard)
|
||||
|
||||
if err != nil {
|
||||
return err
|
||||
} else if has == false {
|
||||
@ -117,7 +144,6 @@ func GetDashboard(query *m.GetDashboardQuery) error {
|
||||
|
||||
dashboard.Data.Set("id", dashboard.Id)
|
||||
query.Result = &dashboard
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
@ -220,7 +246,7 @@ func GetDashboardTags(query *m.GetDashboardTagsQuery) error {
|
||||
}
|
||||
|
||||
func DeleteDashboard(cmd *m.DeleteDashboardCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
dashboard := m.Dashboard{Slug: cmd.Slug, OrgId: cmd.OrgId}
|
||||
has, err := sess.Get(&dashboard)
|
||||
if err != nil {
|
||||
@ -234,6 +260,7 @@ func DeleteDashboard(cmd *m.DeleteDashboardCommand) error {
|
||||
"DELETE FROM star WHERE dashboard_id = ? ",
|
||||
"DELETE FROM dashboard WHERE id = ?",
|
||||
"DELETE FROM playlist_item WHERE type = 'dashboard_by_id' AND value = ?",
|
||||
"DELETE FROM dashboard_version WHERE dashboard_id = ?",
|
||||
}
|
||||
|
||||
for _, sql := range deletes {
|
||||
@ -243,7 +270,7 @@ func DeleteDashboard(cmd *m.DeleteDashboardCommand) error {
|
||||
}
|
||||
}
|
||||
|
||||
if err := DeleteAlertDefinition(dashboard.Id, sess.Session); err != nil {
|
||||
if err := DeleteAlertDefinition(dashboard.Id, sess); err != nil {
|
||||
return nil
|
||||
}
|
||||
|
||||
|
@ -3,7 +3,6 @@ package sqlstore
|
||||
import (
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
"github.com/grafana/grafana/pkg/setting"
|
||||
@ -18,7 +17,7 @@ func init() {
|
||||
}
|
||||
|
||||
func DeleteExpiredSnapshots(cmd *m.DeleteExpiredSnapshotsCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var expiredCount int64 = 0
|
||||
|
||||
if setting.SnapShotRemoveExpired {
|
||||
@ -36,7 +35,7 @@ func DeleteExpiredSnapshots(cmd *m.DeleteExpiredSnapshotsCommand) error {
|
||||
}
|
||||
|
||||
func CreateDashboardSnapshot(cmd *m.CreateDashboardSnapshotCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
// never
|
||||
var expires = time.Now().Add(time.Hour * 24 * 365 * 50)
|
||||
@ -65,7 +64,7 @@ func CreateDashboardSnapshot(cmd *m.CreateDashboardSnapshotCommand) error {
|
||||
}
|
||||
|
||||
func DeleteDashboardSnapshot(cmd *m.DeleteDashboardSnapshotCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM dashboard_snapshot WHERE delete_key=?"
|
||||
_, err := sess.Exec(rawSql, cmd.DeleteKey)
|
||||
return err
|
||||
|
60  pkg/services/sqlstore/dashboard_version.go (new file)
@@ -0,0 +1,60 @@
package sqlstore

import (
	"github.com/grafana/grafana/pkg/bus"
	m "github.com/grafana/grafana/pkg/models"
)

func init() {
	bus.AddHandler("sql", GetDashboardVersion)
	bus.AddHandler("sql", GetDashboardVersions)
}

// GetDashboardVersion gets the dashboard version for the given dashboard ID and version number.
func GetDashboardVersion(query *m.GetDashboardVersionQuery) error {
	version := m.DashboardVersion{}
	has, err := x.Where("dashboard_version.dashboard_id=? AND dashboard_version.version=? AND dashboard.org_id=?", query.DashboardId, query.Version, query.OrgId).
		Join("LEFT", "dashboard", `dashboard.id = dashboard_version.dashboard_id`).
		Get(&version)

	if err != nil {
		return err
	}

	if !has {
		return m.ErrDashboardVersionNotFound
	}

	version.Data.Set("id", version.DashboardId)
	query.Result = &version
	return nil
}

// GetDashboardVersions gets all dashboard versions for the given dashboard ID.
func GetDashboardVersions(query *m.GetDashboardVersionsQuery) error {
	err := x.Table("dashboard_version").
		Select(`dashboard_version.id,
		dashboard_version.dashboard_id,
		dashboard_version.parent_version,
		dashboard_version.restored_from,
		dashboard_version.version,
		dashboard_version.created,
		dashboard_version.created_by as created_by_id,
		dashboard_version.message,
		dashboard_version.data,`+
			dialect.Quote("user")+`.login as created_by`).
		Join("LEFT", "user", `dashboard_version.created_by = `+dialect.Quote("user")+`.id`).
		Join("LEFT", "dashboard", `dashboard.id = dashboard_version.dashboard_id`).
		Where("dashboard_version.dashboard_id=? AND dashboard.org_id=?", query.DashboardId, query.OrgId).
		OrderBy("dashboard_version.version DESC").
		Limit(query.Limit, query.Start).
		Find(&query.Result)
	if err != nil {
		return err
	}

	if len(query.Result) < 1 {
		return m.ErrNoVersionsForDashboardId
	}
	return nil
}
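For illustration (not from this commit), a minimal sketch of how a caller could exercise these new handlers through Grafana's command/query bus. The query type, result fields and error value come from the file above; the wrapping function, its arguments and the `fmt` import are assumptions for the example.

```go
// Hypothetical caller: look up one stored dashboard version via the bus.
func printDashboardVersion(orgId, dashboardId int64, version int) error {
	query := m.GetDashboardVersionQuery{
		OrgId:       orgId,
		DashboardId: dashboardId,
		Version:     version,
	}
	if err := bus.Dispatch(&query); err != nil {
		// e.g. m.ErrDashboardVersionNotFound when that version does not exist
		return err
	}
	// query.Result holds the stored m.DashboardVersion, including the dashboard JSON in Data.
	fmt.Printf("version %d created by user %d\n", query.Result.Version, query.Result.CreatedBy)
	return nil
}
```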
103  pkg/services/sqlstore/dashboard_version_test.go (new file)
@ -0,0 +1,103 @@
|
||||
package sqlstore
|
||||
|
||||
import (
|
||||
"reflect"
|
||||
"testing"
|
||||
|
||||
. "github.com/smartystreets/goconvey/convey"
|
||||
|
||||
"github.com/grafana/grafana/pkg/components/simplejson"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
|
||||
func updateTestDashboard(dashboard *m.Dashboard, data map[string]interface{}) {
|
||||
data["title"] = dashboard.Title
|
||||
|
||||
saveCmd := m.SaveDashboardCommand{
|
||||
OrgId: dashboard.OrgId,
|
||||
Overwrite: true,
|
||||
Dashboard: simplejson.NewFromAny(data),
|
||||
}
|
||||
|
||||
err := SaveDashboard(&saveCmd)
|
||||
So(err, ShouldBeNil)
|
||||
}
|
||||
|
||||
func TestGetDashboardVersion(t *testing.T) {
|
||||
Convey("Testing dashboard version retrieval", t, func() {
|
||||
InitTestDB(t)
|
||||
|
||||
Convey("Get a Dashboard ID and version ID", func() {
|
||||
savedDash := insertTestDashboard("test dash 26", 1, "diff")
|
||||
|
||||
query := m.GetDashboardVersionQuery{
|
||||
DashboardId: savedDash.Id,
|
||||
Version: savedDash.Version,
|
||||
OrgId: 1,
|
||||
}
|
||||
|
||||
err := GetDashboardVersion(&query)
|
||||
So(err, ShouldBeNil)
|
||||
So(savedDash.Id, ShouldEqual, query.DashboardId)
|
||||
So(savedDash.Version, ShouldEqual, query.Version)
|
||||
|
||||
dashCmd := m.GetDashboardQuery{
|
||||
OrgId: savedDash.OrgId,
|
||||
Slug: savedDash.Slug,
|
||||
}
|
||||
|
||||
err = GetDashboard(&dashCmd)
|
||||
So(err, ShouldBeNil)
|
||||
eq := reflect.DeepEqual(dashCmd.Result.Data, query.Result.Data)
|
||||
So(eq, ShouldEqual, true)
|
||||
})
|
||||
|
||||
Convey("Attempt to get a version that doesn't exist", func() {
|
||||
query := m.GetDashboardVersionQuery{
|
||||
DashboardId: int64(999),
|
||||
Version: 123,
|
||||
OrgId: 1,
|
||||
}
|
||||
|
||||
err := GetDashboardVersion(&query)
|
||||
So(err, ShouldNotBeNil)
|
||||
So(err, ShouldEqual, m.ErrDashboardVersionNotFound)
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
func TestGetDashboardVersions(t *testing.T) {
|
||||
Convey("Testing dashboard versions retrieval", t, func() {
|
||||
InitTestDB(t)
|
||||
savedDash := insertTestDashboard("test dash 43", 1, "diff-all")
|
||||
|
||||
Convey("Get all versions for a given Dashboard ID", func() {
|
||||
query := m.GetDashboardVersionsQuery{DashboardId: savedDash.Id, OrgId: 1}
|
||||
|
||||
err := GetDashboardVersions(&query)
|
||||
So(err, ShouldBeNil)
|
||||
So(len(query.Result), ShouldEqual, 1)
|
||||
})
|
||||
|
||||
Convey("Attempt to get the versions for a non-existent Dashboard ID", func() {
|
||||
query := m.GetDashboardVersionsQuery{DashboardId: int64(999), OrgId: 1}
|
||||
|
||||
err := GetDashboardVersions(&query)
|
||||
So(err, ShouldNotBeNil)
|
||||
So(err, ShouldEqual, m.ErrNoVersionsForDashboardId)
|
||||
So(len(query.Result), ShouldEqual, 0)
|
||||
})
|
||||
|
||||
Convey("Get all versions for an updated dashboard", func() {
|
||||
updateTestDashboard(savedDash, map[string]interface{}{
|
||||
"tags": "different-tag",
|
||||
})
|
||||
|
||||
query := m.GetDashboardVersionsQuery{DashboardId: savedDash.Id, OrgId: 1}
|
||||
err := GetDashboardVersions(&query)
|
||||
|
||||
So(err, ShouldBeNil)
|
||||
So(len(query.Result), ShouldEqual, 2)
|
||||
})
|
||||
})
|
||||
}
|
@ -6,8 +6,6 @@ import (
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
"github.com/grafana/grafana/pkg/components/securejsondata"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
)
|
||||
|
||||
func init() {
|
||||
@ -52,7 +50,7 @@ func GetDataSources(query *m.GetDataSourcesQuery) error {
|
||||
}
|
||||
|
||||
func DeleteDataSourceById(cmd *m.DeleteDataSourceByIdCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM data_source WHERE id=? and org_id=?"
|
||||
_, err := sess.Exec(rawSql, cmd.Id, cmd.OrgId)
|
||||
return err
|
||||
@ -60,7 +58,7 @@ func DeleteDataSourceById(cmd *m.DeleteDataSourceByIdCommand) error {
|
||||
}
|
||||
|
||||
func DeleteDataSourceByName(cmd *m.DeleteDataSourceByNameCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM data_source WHERE name=? and org_id=?"
|
||||
_, err := sess.Exec(rawSql, cmd.Name, cmd.OrgId)
|
||||
return err
|
||||
@ -69,7 +67,7 @@ func DeleteDataSourceByName(cmd *m.DeleteDataSourceByNameCommand) error {
|
||||
|
||||
func AddDataSource(cmd *m.AddDataSourceCommand) error {
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
existing := m.DataSource{OrgId: cmd.OrgId, Name: cmd.Name}
|
||||
has, _ := sess.Get(&existing)
|
||||
|
||||
@ -109,7 +107,7 @@ func AddDataSource(cmd *m.AddDataSourceCommand) error {
|
||||
})
|
||||
}
|
||||
|
||||
func updateIsDefaultFlag(ds *m.DataSource, sess *xorm.Session) error {
|
||||
func updateIsDefaultFlag(ds *m.DataSource, sess *DBSession) error {
|
||||
// Handle is default flag
|
||||
if ds.IsDefault {
|
||||
rawSql := "UPDATE data_source SET is_default=? WHERE org_id=? AND id <> ?"
|
||||
@ -122,7 +120,7 @@ func updateIsDefaultFlag(ds *m.DataSource, sess *xorm.Session) error {
|
||||
|
||||
func UpdateDataSource(cmd *m.UpdateDataSourceCommand) error {
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
ds := &m.DataSource{
|
||||
Id: cmd.Id,
|
||||
OrgId: cmd.OrgId,
|
||||
|
@ -23,67 +23,59 @@ func NewXormLogger(level glog.Lvl, grafanaLog glog.Logger) *XormLogger {
|
||||
}
|
||||
|
||||
// Error implement core.ILogger
|
||||
func (s *XormLogger) Err(v ...interface{}) error {
|
||||
func (s *XormLogger) Error(v ...interface{}) {
|
||||
if s.level <= glog.LvlError {
|
||||
s.grafanaLog.Error(fmt.Sprint(v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Errorf implement core.ILogger
|
||||
func (s *XormLogger) Errf(format string, v ...interface{}) error {
|
||||
func (s *XormLogger) Errorf(format string, v ...interface{}) {
|
||||
if s.level <= glog.LvlError {
|
||||
s.grafanaLog.Error(fmt.Sprintf(format, v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Debug implement core.ILogger
|
||||
func (s *XormLogger) Debug(v ...interface{}) error {
|
||||
func (s *XormLogger) Debug(v ...interface{}) {
|
||||
if s.level <= glog.LvlDebug {
|
||||
s.grafanaLog.Debug(fmt.Sprint(v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Debugf implement core.ILogger
|
||||
func (s *XormLogger) Debugf(format string, v ...interface{}) error {
|
||||
func (s *XormLogger) Debugf(format string, v ...interface{}) {
|
||||
if s.level <= glog.LvlDebug {
|
||||
s.grafanaLog.Debug(fmt.Sprintf(format, v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Info implement core.ILogger
|
||||
func (s *XormLogger) Info(v ...interface{}) error {
|
||||
func (s *XormLogger) Info(v ...interface{}) {
|
||||
if s.level <= glog.LvlInfo {
|
||||
s.grafanaLog.Info(fmt.Sprint(v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Infof implement core.ILogger
|
||||
func (s *XormLogger) Infof(format string, v ...interface{}) error {
|
||||
func (s *XormLogger) Infof(format string, v ...interface{}) {
|
||||
if s.level <= glog.LvlInfo {
|
||||
s.grafanaLog.Info(fmt.Sprintf(format, v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Warn implement core.ILogger
|
||||
func (s *XormLogger) Warning(v ...interface{}) error {
|
||||
func (s *XormLogger) Warn(v ...interface{}) {
|
||||
if s.level <= glog.LvlWarn {
|
||||
s.grafanaLog.Warn(fmt.Sprint(v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Warnf implement core.ILogger
|
||||
func (s *XormLogger) Warningf(format string, v ...interface{}) error {
|
||||
func (s *XormLogger) Warnf(format string, v ...interface{}) {
|
||||
if s.level <= glog.LvlWarn {
|
||||
s.grafanaLog.Warn(fmt.Sprintf(format, v...))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Level implement core.ILogger
|
||||
@ -103,8 +95,7 @@ func (s *XormLogger) Level() core.LogLevel {
|
||||
}
|
||||
|
||||
// SetLevel implement core.ILogger
|
||||
func (s *XormLogger) SetLevel(l core.LogLevel) error {
|
||||
return nil
|
||||
func (s *XormLogger) SetLevel(l core.LogLevel) {
|
||||
}
|
||||
|
||||
// ShowSQL implement core.ILogger
|
||||
|
@ -16,7 +16,7 @@ func addAlertMigrations(mg *Migrator) {
|
||||
{Name: "org_id", Type: DB_BigInt, Nullable: false},
|
||||
{Name: "name", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "message", Type: DB_Text, Nullable: false},
|
||||
{Name: "state", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "state", Type: DB_NVarchar, Length: 190, Nullable: false},
|
||||
{Name: "settings", Type: DB_Text, Nullable: false},
|
||||
{Name: "frequency", Type: DB_BigInt, Nullable: false},
|
||||
{Name: "handler", Type: DB_BigInt, Nullable: false},
|
||||
@ -70,7 +70,7 @@ func addAlertMigrations(mg *Migrator) {
|
||||
mg.AddMigration("Update alert table charset", NewTableCharsetMigration("alert", []*Column{
|
||||
{Name: "name", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "message", Type: DB_Text, Nullable: false},
|
||||
{Name: "state", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "state", Type: DB_NVarchar, Length: 190, Nullable: false},
|
||||
{Name: "settings", Type: DB_Text, Nullable: false},
|
||||
{Name: "severity", Type: DB_Text, Nullable: false},
|
||||
{Name: "execution_error", Type: DB_Text, Nullable: false},
|
||||
|
@ -8,7 +8,7 @@ func addDashboardMigration(mg *Migrator) {
|
||||
Columns: []*Column{
|
||||
{Name: "id", Type: DB_BigInt, IsPrimaryKey: true, IsAutoIncrement: true},
|
||||
{Name: "version", Type: DB_Int, Nullable: false},
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 190, Nullable: false},
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 189, Nullable: false},
|
||||
{Name: "title", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "data", Type: DB_Text, Nullable: false},
|
||||
{Name: "account_id", Type: DB_BigInt, Nullable: false},
|
||||
@ -56,7 +56,7 @@ func addDashboardMigration(mg *Migrator) {
|
||||
Columns: []*Column{
|
||||
{Name: "id", Type: DB_BigInt, IsPrimaryKey: true, IsAutoIncrement: true},
|
||||
{Name: "version", Type: DB_Int, Nullable: false},
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 190, Nullable: false},
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 189, Nullable: false},
|
||||
{Name: "title", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "data", Type: DB_Text, Nullable: false},
|
||||
{Name: "org_id", Type: DB_BigInt, Nullable: false},
|
||||
@ -114,7 +114,7 @@ func addDashboardMigration(mg *Migrator) {
|
||||
|
||||
// add column to store plugin_id
|
||||
mg.AddMigration("Add column plugin_id in dashboard", NewAddColumnMigration(dashboardV2, &Column{
|
||||
Name: "plugin_id", Type: DB_NVarchar, Nullable: true, Length: 255,
|
||||
Name: "plugin_id", Type: DB_NVarchar, Nullable: true, Length: 189,
|
||||
}))
|
||||
|
||||
mg.AddMigration("Add index for plugin_id in dashboard", NewAddIndexMigration(dashboardV2, &Index{
|
||||
@ -127,9 +127,9 @@ func addDashboardMigration(mg *Migrator) {
|
||||
}))
|
||||
|
||||
mg.AddMigration("Update dashboard table charset", NewTableCharsetMigration("dashboard", []*Column{
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 190, Nullable: false},
|
||||
{Name: "slug", Type: DB_NVarchar, Length: 189, Nullable: false},
|
||||
{Name: "title", Type: DB_NVarchar, Length: 255, Nullable: false},
|
||||
{Name: "plugin_id", Type: DB_NVarchar, Nullable: true, Length: 255},
|
||||
{Name: "plugin_id", Type: DB_NVarchar, Nullable: true, Length: 189},
|
||||
{Name: "data", Type: DB_MediumText, Nullable: false},
|
||||
}))
|
||||
|
||||
|
61  pkg/services/sqlstore/migrations/dashboard_version_mig.go (new file)
@@ -0,0 +1,61 @@
package migrations

import . "github.com/grafana/grafana/pkg/services/sqlstore/migrator"

func addDashboardVersionMigration(mg *Migrator) {
	dashboardVersionV1 := Table{
		Name: "dashboard_version",
		Columns: []*Column{
			{Name: "id", Type: DB_BigInt, IsPrimaryKey: true, IsAutoIncrement: true},
			{Name: "dashboard_id", Type: DB_BigInt},
			{Name: "parent_version", Type: DB_Int, Nullable: false},
			{Name: "restored_from", Type: DB_Int, Nullable: false},
			{Name: "version", Type: DB_Int, Nullable: false},
			{Name: "created", Type: DB_DateTime, Nullable: false},
			{Name: "created_by", Type: DB_BigInt, Nullable: false},
			{Name: "message", Type: DB_Text, Nullable: false},
			{Name: "data", Type: DB_Text, Nullable: false},
		},
		Indices: []*Index{
			{Cols: []string{"dashboard_id"}},
			{Cols: []string{"dashboard_id", "version"}, Type: UniqueIndex},
		},
	}

	mg.AddMigration("create dashboard_version table v1", NewAddTableMigration(dashboardVersionV1))
	mg.AddMigration("add index dashboard_version.dashboard_id", NewAddIndexMigration(dashboardVersionV1, dashboardVersionV1.Indices[0]))
	mg.AddMigration("add unique index dashboard_version.dashboard_id and dashboard_version.version", NewAddIndexMigration(dashboardVersionV1, dashboardVersionV1.Indices[1]))

	// before new dashboards where created with version 0, now they are always inserted with version 1
	const setVersionTo1WhereZeroSQL = `UPDATE dashboard SET version = 1 WHERE version = 0`
	mg.AddMigration("Set dashboard version to 1 where 0", new(RawSqlMigration).
		Sqlite(setVersionTo1WhereZeroSQL).
		Postgres(setVersionTo1WhereZeroSQL).
		Mysql(setVersionTo1WhereZeroSQL))

	const rawSQL = `INSERT INTO dashboard_version
	(
		dashboard_id,
		version,
		parent_version,
		restored_from,
		created,
		created_by,
		message,
		data
	)
	SELECT
		dashboard.id,
		dashboard.version,
		dashboard.version,
		dashboard.version,
		dashboard.updated,
		dashboard.updated_by,
		'',
		dashboard.data
	FROM dashboard;`
	mg.AddMigration("save existing dashboard data in dashboard_version table v1", new(RawSqlMigration).
		Sqlite(rawSQL).
		Postgres(rawSQL).
		Mysql(rawSQL))
}

@@ -25,6 +25,7 @@ func AddMigrations(mg *Migrator) {
	addAlertMigrations(mg)
	addAnnotationMig(mg)
	addTestDataMigrations(mg)
+	addDashboardVersionMigration(mg)
}

func addMigrationLogMigrations(mg *Migrator) {
|
@ -9,10 +9,10 @@ func addTempUserMigrations(mg *Migrator) {
|
||||
{Name: "id", Type: DB_BigInt, IsPrimaryKey: true, IsAutoIncrement: true},
|
||||
{Name: "org_id", Type: DB_BigInt, Nullable: false},
|
||||
{Name: "version", Type: DB_Int, Nullable: false},
|
||||
{Name: "email", Type: DB_NVarchar, Length: 255},
|
||||
{Name: "email", Type: DB_NVarchar, Length: 190},
|
||||
{Name: "name", Type: DB_NVarchar, Length: 255, Nullable: true},
|
||||
{Name: "role", Type: DB_NVarchar, Length: 20, Nullable: true},
|
||||
{Name: "code", Type: DB_NVarchar, Length: 255},
|
||||
{Name: "code", Type: DB_NVarchar, Length: 190},
|
||||
{Name: "status", Type: DB_Varchar, Length: 20},
|
||||
{Name: "invited_by_user_id", Type: DB_BigInt, Nullable: true},
|
||||
{Name: "email_sent", Type: DB_Bool},
|
||||
@ -37,10 +37,10 @@ func addTempUserMigrations(mg *Migrator) {
|
||||
addTableIndicesMigrations(mg, "v1-7", tempUserV1)
|
||||
|
||||
mg.AddMigration("Update temp_user table charset", NewTableCharsetMigration("temp_user", []*Column{
|
||||
{Name: "email", Type: DB_NVarchar, Length: 255},
|
||||
{Name: "email", Type: DB_NVarchar, Length: 190},
|
||||
{Name: "name", Type: DB_NVarchar, Length: 255, Nullable: true},
|
||||
{Name: "role", Type: DB_NVarchar, Length: 20, Nullable: true},
|
||||
{Name: "code", Type: DB_NVarchar, Length: 255},
|
||||
{Name: "code", Type: DB_NVarchar, Length: 190},
|
||||
{Name: "status", Type: DB_Varchar, Length: 20},
|
||||
{Name: "remote_addr", Type: DB_Varchar, Length: 255, Nullable: true},
|
||||
}))
|
||||
|
@ -63,7 +63,7 @@ func GetOrgByName(query *m.GetOrgByNameQuery) error {
|
||||
return nil
|
||||
}
|
||||
|
||||
func isOrgNameTaken(name string, existingId int64, sess *session) (bool, error) {
|
||||
func isOrgNameTaken(name string, existingId int64, sess *DBSession) (bool, error) {
|
||||
// check if org name is taken
|
||||
var org m.Org
|
||||
exists, err := sess.Where("name=?", name).Get(&org)
|
||||
@ -80,7 +80,7 @@ func isOrgNameTaken(name string, existingId int64, sess *session) (bool, error)
|
||||
}
|
||||
|
||||
func CreateOrg(cmd *m.CreateOrgCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
if isNameTaken, err := isOrgNameTaken(cmd.Name, 0, sess); err != nil {
|
||||
return err
|
||||
@ -120,7 +120,7 @@ func CreateOrg(cmd *m.CreateOrgCommand) error {
|
||||
}
|
||||
|
||||
func UpdateOrg(cmd *m.UpdateOrgCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
if isNameTaken, err := isOrgNameTaken(cmd.Name, cmd.OrgId, sess); err != nil {
|
||||
return err
|
||||
@ -154,7 +154,7 @@ func UpdateOrg(cmd *m.UpdateOrgCommand) error {
|
||||
}
|
||||
|
||||
func UpdateOrgAddress(cmd *m.UpdateOrgAddressCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
org := m.Org{
|
||||
Address1: cmd.Address1,
|
||||
Address2: cmd.Address2,
|
||||
@ -181,7 +181,7 @@ func UpdateOrgAddress(cmd *m.UpdateOrgAddressCommand) error {
|
||||
}
|
||||
|
||||
func DeleteOrg(cmd *m.DeleteOrgCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
if res, err := sess.Query("SELECT 1 from org WHERE id=?", cmd.Id); err != nil {
|
||||
return err
|
||||
} else if len(res) != 1 {
|
||||
|
@ -4,8 +4,6 @@ import (
|
||||
"fmt"
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -18,7 +16,7 @@ func init() {
|
||||
}
|
||||
|
||||
func AddOrgUser(cmd *m.AddOrgUserCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
// check if user exists
|
||||
if res, err := sess.Query("SELECT 1 from org_user WHERE org_id=? and user_id=?", cmd.OrgId, cmd.UserId); err != nil {
|
||||
return err
|
||||
@ -46,7 +44,7 @@ func AddOrgUser(cmd *m.AddOrgUserCommand) error {
|
||||
}
|
||||
|
||||
func UpdateOrgUser(cmd *m.UpdateOrgUserCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var orgUser m.OrgUser
|
||||
exists, err := sess.Where("org_id=? AND user_id=?", cmd.OrgId, cmd.UserId).Get(&orgUser)
|
||||
if err != nil {
|
||||
@ -81,7 +79,7 @@ func GetOrgUsers(query *m.GetOrgUsersQuery) error {
|
||||
}
|
||||
|
||||
func RemoveOrgUser(cmd *m.RemoveOrgUserCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM org_user WHERE org_id=? and user_id=?"
|
||||
_, err := sess.Exec(rawSql, cmd.OrgId, cmd.UserId)
|
||||
if err != nil {
|
||||
@ -92,7 +90,7 @@ func RemoveOrgUser(cmd *m.RemoveOrgUserCommand) error {
|
||||
})
|
||||
}
|
||||
|
||||
func validateOneAdminLeftInOrg(orgId int64, sess *xorm.Session) error {
|
||||
func validateOneAdminLeftInOrg(orgId int64, sess *DBSession) error {
|
||||
// validate that there is an admin user left
|
||||
res, err := sess.Query("SELECT 1 from org_user WHERE org_id=? and role='Admin'", orgId)
|
||||
if err != nil {
|
||||
|
@ -3,8 +3,6 @@ package sqlstore
|
||||
import (
|
||||
"fmt"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -85,12 +83,12 @@ func UpdatePlaylist(cmd *m.UpdatePlaylistCommand) error {
|
||||
|
||||
playlistItems := make([]m.PlaylistItem, 0)
|
||||
|
||||
for _, item := range cmd.Items {
|
||||
for index, item := range cmd.Items {
|
||||
playlistItems = append(playlistItems, m.PlaylistItem{
|
||||
PlaylistId: playlist.Id,
|
||||
Type: item.Type,
|
||||
Value: item.Value,
|
||||
Order: item.Order,
|
||||
Order: index + 1,
|
||||
Title: item.Title,
|
||||
})
|
||||
}
|
||||
@ -118,7 +116,7 @@ func DeletePlaylist(cmd *m.DeletePlaylistCommand) error {
|
||||
return m.ErrCommandValidationFailed
|
||||
}
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawPlaylistSql = "DELETE FROM playlist WHERE id = ? and org_id = ?"
|
||||
_, err := sess.Exec(rawPlaylistSql, cmd.Id, cmd.OrgId)
|
||||
|
||||
|
@ -44,7 +44,7 @@ func GetPluginSettingById(query *m.GetPluginSettingByIdQuery) error {
|
||||
}
|
||||
|
||||
func UpdatePluginSetting(cmd *m.UpdatePluginSettingCmd) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var pluginSetting m.PluginSetting
|
||||
|
||||
exists, err := sess.Where("org_id=? and plugin_id=?", cmd.OrgId, cmd.PluginId).Get(&pluginSetting)
|
||||
@ -104,7 +104,7 @@ func UpdatePluginSetting(cmd *m.UpdatePluginSettingCmd) error {
|
||||
}
|
||||
|
||||
func UpdatePluginSettingVersion(cmd *m.UpdatePluginSettingVersionCmd) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
_, err := sess.Exec("UPDATE plugin_setting SET plugin_version=? WHERE org_id=? AND plugin_id=?", cmd.PluginVersion, cmd.OrgId, cmd.PluginId)
|
||||
return err
|
||||
|
@ -68,7 +68,7 @@ func GetPreferences(query *m.GetPreferencesQuery) error {
|
||||
}
|
||||
|
||||
func SavePreferences(cmd *m.SavePreferencesCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
var prefs m.Preferences
|
||||
exists, err := sess.Where("org_id=? AND user_id=?", cmd.OrgId, cmd.UserId).Get(&prefs)
|
||||
|
@ -2,6 +2,7 @@ package sqlstore
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
"github.com/grafana/grafana/pkg/setting"
|
||||
@ -94,7 +95,7 @@ func GetOrgQuotas(query *m.GetOrgQuotasQuery) error {
|
||||
}
|
||||
|
||||
func UpdateOrgQuota(cmd *m.UpdateOrgQuotaCmd) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
//Check if quota is already defined in the DB
|
||||
quota := m.Quota{
|
||||
Target: cmd.Target,
|
||||
@ -194,7 +195,7 @@ func GetUserQuotas(query *m.GetUserQuotasQuery) error {
|
||||
}
|
||||
|
||||
func UpdateUserQuota(cmd *m.UpdateUserQuotaCmd) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
//Check if quota is already defined in the DB
|
||||
quota := m.Quota{
|
||||
Target: cmd.Target,
|
||||
|
@@ -9,18 +9,21 @@ import (
	sqlite3 "github.com/mattn/go-sqlite3"
)

-type dbTransactionFunc func(sess *xorm.Session) error
-type dbTransactionFunc2 func(sess *session) error
-
-type session struct {
+type DBSession struct {
	*xorm.Session
	events []interface{}
}

-func (sess *session) publishAfterCommit(msg interface{}) {
+type dbTransactionFunc func(sess *DBSession) error
+
+func (sess *DBSession) publishAfterCommit(msg interface{}) {
	sess.events = append(sess.events, msg)
}

+func newSession() *DBSession {
+	return &DBSession{Session: x.NewSession()}
+}
+
func inTransaction(callback dbTransactionFunc) error {
	return inTransactionWithRetry(callback, 0)
}

@@ -28,7 +31,7 @@ func inTransaction(callback dbTransactionFunc) error {
func inTransactionWithRetry(callback dbTransactionFunc, retry int) error {
	var err error

-	sess := x.NewSession()
+	sess := newSession()
	defer sess.Close()

	if err = sess.Begin(); err != nil {

@@ -54,28 +57,6 @@ func inTransactionWithRetry(callback dbTransactionFunc, retry int) error {
		return err
	}

	return nil
}
-
-func inTransaction2(callback dbTransactionFunc2) error {
-	var err error
-
-	sess := session{Session: x.NewSession()}
-
-	defer sess.Close()
-	if err = sess.Begin(); err != nil {
-		return err
-	}
-
-	err = callback(&sess)
-
-	if err != nil {
-		sess.Rollback()
-		return err
-	} else if err = sess.Commit(); err != nil {
-		return err
-	}
-
-	if len(sess.events) > 0 {
-		for _, e := range sess.events {
-			if err = bus.Publish(e); err != nil {
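For illustration (not from this commit), a sketch of the calling convention after this refactor: every sqlstore handler now receives a `*DBSession` from `inTransaction`. The command type, table name and event struct below are made up for the example; `publishAfterCommit` and `inTransaction` are the helpers shown above.

```go
// Hypothetical handler using the consolidated DBSession-based transaction helper.
func DeleteWidget(cmd *m.DeleteWidgetCommand) error {
	return inTransaction(func(sess *DBSession) error {
		if _, err := sess.Exec("DELETE FROM widget WHERE id=? AND org_id=?", cmd.Id, cmd.OrgId); err != nil {
			return err
		}
		// Events queued on the session are published on the bus after the
		// transaction commits (the behaviour previously provided by inTransaction2).
		sess.publishAfterCommit(&WidgetDeleted{Id: cmd.Id}) // WidgetDeleted is illustrative only
		return nil
	})
}
```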
|
@ -12,7 +12,7 @@ func init() {
|
||||
bus.AddHandler("sql", InsertSqlTestData)
|
||||
}
|
||||
|
||||
func sqlRandomWalk(m1 string, m2 string, intWalker int64, floatWalker float64, sess *session) error {
|
||||
func sqlRandomWalk(m1 string, m2 string, intWalker int64, floatWalker float64, sess *DBSession) error {
|
||||
|
||||
timeWalker := time.Now().UTC().Add(time.Hour * -200)
|
||||
now := time.Now().UTC()
|
||||
@ -45,7 +45,7 @@ func sqlRandomWalk(m1 string, m2 string, intWalker int64, floatWalker float64, s
|
||||
}
|
||||
|
||||
func InsertSqlTestData(cmd *m.InsertSqlTestDataCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var err error
|
||||
|
||||
sqlog.Info("SQL TestData: Clearing previous test data")
|
||||
|
@@ -199,7 +199,7 @@ func LoadConfig() {

	if DbCfg.Type == "sqlite3" {
		UseSQLite3 = true
-		// only allow one connection as sqlite3 has multi threading issues that casue table locks
+		// only allow one connection as sqlite3 has multi threading issues that cause table locks
		// DbCfg.MaxIdleConn = 1
		// DbCfg.MaxOpenConn = 1
	}
|
@ -1,8 +1,6 @@
|
||||
package sqlstore
|
||||
|
||||
import (
|
||||
"github.com/go-xorm/xorm"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -36,7 +34,7 @@ func StarDashboard(cmd *m.StarDashboardCommand) error {
|
||||
return m.ErrCommandValidationFailed
|
||||
}
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
entity := m.Star{
|
||||
UserId: cmd.UserId,
|
||||
@ -53,7 +51,7 @@ func UnstarDashboard(cmd *m.UnstarDashboardCommand) error {
|
||||
return m.ErrCommandValidationFailed
|
||||
}
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "DELETE FROM star WHERE user_id=? and dashboard_id=?"
|
||||
_, err := sess.Exec(rawSql, cmd.UserId, cmd.DashboardId)
|
||||
return err
|
||||
|
@ -3,7 +3,6 @@ package sqlstore
|
||||
import (
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
m "github.com/grafana/grafana/pkg/models"
|
||||
)
|
||||
@ -16,7 +15,7 @@ func init() {
|
||||
}
|
||||
|
||||
func UpdateTempUserStatus(cmd *m.UpdateTempUserStatusCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
var rawSql = "UPDATE temp_user SET status=? WHERE code=?"
|
||||
_, err := sess.Exec(rawSql, string(cmd.Status), cmd.Code)
|
||||
return err
|
||||
@ -24,7 +23,7 @@ func UpdateTempUserStatus(cmd *m.UpdateTempUserStatusCommand) error {
|
||||
}
|
||||
|
||||
func CreateTempUser(cmd *m.CreateTempUserCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
// create user
|
||||
user := &m.TempUser{
|
||||
|
@ -5,8 +5,6 @@ import (
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/go-xorm/xorm"
|
||||
|
||||
"fmt"
|
||||
|
||||
"github.com/grafana/grafana/pkg/bus"
|
||||
@ -34,7 +32,7 @@ func init() {
|
||||
bus.AddHandler("sql", SetUserHelpFlag)
|
||||
}
|
||||
|
||||
func getOrgIdForNewUser(cmd *m.CreateUserCommand, sess *session) (int64, error) {
|
||||
func getOrgIdForNewUser(cmd *m.CreateUserCommand, sess *DBSession) (int64, error) {
|
||||
if cmd.SkipOrgSetup {
|
||||
return -1, nil
|
||||
}
|
||||
@ -77,7 +75,7 @@ func getOrgIdForNewUser(cmd *m.CreateUserCommand, sess *session) (int64, error)
|
||||
}
|
||||
|
||||
func CreateUser(cmd *m.CreateUserCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
orgId, err := getOrgIdForNewUser(cmd, sess)
|
||||
if err != nil {
|
||||
return err
|
||||
@ -220,7 +218,7 @@ func GetUserByEmail(query *m.GetUserByEmailQuery) error {
|
||||
}
|
||||
|
||||
func UpdateUser(cmd *m.UpdateUserCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
user := m.User{
|
||||
Name: cmd.Name,
|
||||
@ -247,7 +245,7 @@ func UpdateUser(cmd *m.UpdateUserCommand) error {
|
||||
}
|
||||
|
||||
func ChangeUserPassword(cmd *m.ChangeUserPasswordCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
user := m.User{
|
||||
Password: cmd.NewPassword,
|
||||
@ -277,7 +275,7 @@ func SetUsingOrg(cmd *m.SetUsingOrgCommand) error {
|
||||
return fmt.Errorf("user does not belong to org")
|
||||
}
|
||||
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
user := m.User{}
|
||||
sess.Id(cmd.UserId).Get(&user)
|
||||
|
||||
@ -394,7 +392,7 @@ func SearchUsers(query *m.SearchUsersQuery) error {
|
||||
}
|
||||
|
||||
func DeleteUser(cmd *m.DeleteUserCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
deletes := []string{
|
||||
"DELETE FROM star WHERE user_id = ?",
|
||||
"DELETE FROM " + dialect.Quote("user") + " WHERE id = ?",
|
||||
@ -412,7 +410,7 @@ func DeleteUser(cmd *m.DeleteUserCommand) error {
|
||||
}
|
||||
|
||||
func UpdateUserPermissions(cmd *m.UpdateUserPermissionsCommand) error {
|
||||
return inTransaction(func(sess *xorm.Session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
user := m.User{}
|
||||
sess.Id(cmd.UserId).Get(&user)
|
||||
|
||||
@ -424,7 +422,7 @@ func UpdateUserPermissions(cmd *m.UpdateUserPermissionsCommand) error {
|
||||
}
|
||||
|
||||
func SetUserHelpFlag(cmd *m.SetUserHelpFlagCommand) error {
|
||||
return inTransaction2(func(sess *session) error {
|
||||
return inTransaction(func(sess *DBSession) error {
|
||||
|
||||
user := m.User{
|
||||
Id: cmd.UserId,
|
||||
|
@ -160,7 +160,7 @@ var (
|
||||
logger log.Logger
|
||||
|
||||
// Grafana.NET URL
|
||||
GrafanaNetUrl string
|
||||
GrafanaComUrl string
|
||||
|
||||
// S3 temp image store
|
||||
S3TempImageStoreBucketUrl string
|
||||
@ -306,7 +306,7 @@ func evalEnvVarExpression(value string) string {
|
||||
envVar = strings.TrimSuffix(envVar, "}")
|
||||
envValue := os.Getenv(envVar)
|
||||
|
||||
// if env variable is hostname and it is emtpy use os.Hostname as default
|
||||
// if env variable is hostname and it is empty use os.Hostname as default
|
||||
if envVar == "HOSTNAME" && envValue == "" {
|
||||
envValue, _ = os.Hostname()
|
||||
}
|
||||
@ -582,7 +582,11 @@ func NewConfigContext(args *CommandLineArgs) error {
|
||||
log.Warn("require_email_validation is enabled but smpt is disabled")
|
||||
}
|
||||
|
||||
GrafanaNetUrl = Cfg.Section("grafana_net").Key("url").MustString("https://grafana.com")
|
||||
// check old key name
|
||||
GrafanaComUrl = Cfg.Section("grafana_net").Key("url").MustString("")
|
||||
if GrafanaComUrl == "" {
|
||||
GrafanaComUrl = Cfg.Section("grafana_com").Key("url").MustString("https://grafana.com")
|
||||
}
|
||||
|
||||
imageUploadingSection := Cfg.Section("external_image_storage")
|
||||
ImageUploadProvider = imageUploadingSection.Key("provider").MustString("internal")
|
||||
@ -631,14 +635,14 @@ func LogConfigurationInfo() {
|
||||
|
||||
if len(appliedCommandLineProperties) > 0 {
|
||||
for _, prop := range appliedCommandLineProperties {
|
||||
logger.Info("Config overriden from command line", "arg", prop)
|
||||
logger.Info("Config overridden from command line", "arg", prop)
|
||||
}
|
||||
}
|
||||
|
||||
if len(appliedEnvOverrides) > 0 {
|
||||
text.WriteString("\tEnvironment variables used:\n")
|
||||
for _, prop := range appliedEnvOverrides {
|
||||
logger.Info("Config overriden from Environment variable", "var", prop)
|
||||
logger.Info("Config overridden from Environment variable", "var", prop)
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -73,7 +73,7 @@ func TestLoadingSettings(t *testing.T) {
|
||||
So(Domain, ShouldEqual, "test2")
|
||||
})
|
||||
|
||||
Convey("Defaults can be overriden in specified config file", func() {
|
||||
Convey("Defaults can be overridden in specified config file", func() {
|
||||
NewConfigContext(&CommandLineArgs{
|
||||
HomePath: "../../",
|
||||
Config: filepath.Join(HomePath, "tests/config-files/override.ini"),
|
||||
@ -103,7 +103,7 @@ func TestLoadingSettings(t *testing.T) {
|
||||
So(DataPath, ShouldEqual, "/tmp/env_override")
|
||||
})
|
||||
|
||||
Convey("instance_name default to hostname even if hostname env is emtpy", func() {
|
||||
Convey("instance_name default to hostname even if hostname env is empty", func() {
|
||||
NewConfigContext(&CommandLineArgs{
|
||||
HomePath: "../../",
|
||||
})
|
||||
|
@@ -2,6 +2,7 @@ package social

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"strings"

	"github.com/grafana/grafana/pkg/log"
)

func isEmailAllowed(email string, allowedDomains []string) bool {

@@ -18,3 +22,25 @@ func isEmailAllowed(email string, allowedDomains []string) bool {

	return valid
}
+
+func HttpGet(client *http.Client, url string) ([]byte, error) {
+	r, err := client.Get(url)
+	if err != nil {
+		return nil, err
+	}
+
+	defer r.Body.Close()
+
+	body, err := ioutil.ReadAll(r.Body)
+	if err != nil {
+		return nil, err
+	}
+
+	if r.StatusCode >= 300 {
+		return nil, fmt.Errorf(string(body))
+	}
+
+	log.Trace("HTTP GET %s: %s %s", url, r.Status, string(body))
+
+	return body, nil
+}
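For illustration (not from this commit), how a provider is expected to use the new `HttpGet` helper instead of calling `client.Get` and reading the body by hand. The record struct and the `/emails` path mirror the GitHub provider changes further down; the function itself, and the `encoding/json`, `fmt` and `net/http` imports it relies on, are assumptions for the example.

```go
// Hypothetical helper fetching the primary e-mail address via HttpGet.
func fetchPrimaryEmail(client *http.Client, apiUrl string) (string, error) {
	type emailRecord struct {
		Email   string `json:"email"`
		Primary bool   `json:"primary"`
	}

	body, err := HttpGet(client, apiUrl+"/emails")
	if err != nil {
		return "", fmt.Errorf("Error getting email address: %s", err)
	}

	var records []emailRecord
	if err := json.Unmarshal(body, &records); err != nil {
		return "", fmt.Errorf("Error getting email address: %s", err)
	}

	for _, rec := range records {
		if rec.Primary {
			return rec.Email, nil
		}
	}
	return "", nil
}
```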
|
@ -4,7 +4,6 @@ import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"net/http"
|
||||
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
@ -84,22 +83,14 @@ func (s *GenericOAuth) FetchPrivateEmail(client *http.Client) (string, error) {
|
||||
IsConfirmed bool `json:"is_confirmed"`
|
||||
}
|
||||
|
||||
emailsUrl := fmt.Sprintf(s.apiUrl + "/emails")
|
||||
r, err := client.Get(emailsUrl)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/emails"))
|
||||
if err != nil {
|
||||
return "", err
|
||||
return "", fmt.Errorf("Error getting email address: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
body, err := ioutil.ReadAll(r.Body)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
err = json.Unmarshal(body, records)
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
var data struct {
|
||||
Values []Record `json:"values"`
|
||||
@ -107,7 +98,7 @@ func (s *GenericOAuth) FetchPrivateEmail(client *http.Client) (string, error) {
|
||||
|
||||
err = json.Unmarshal(body, &data)
|
||||
if err != nil {
|
||||
return "", err
|
||||
return "", fmt.Errorf("Error getting email address: %s", err)
|
||||
}
|
||||
|
||||
records = data.Values
|
||||
@ -129,18 +120,16 @@ func (s *GenericOAuth) FetchTeamMemberships(client *http.Client) ([]int, error)
|
||||
Id int `json:"id"`
|
||||
}
|
||||
|
||||
membershipUrl := fmt.Sprintf(s.apiUrl + "/teams")
|
||||
r, err := client.Get(membershipUrl)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/teams"))
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting team memberships: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&records); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting team memberships: %s", err)
|
||||
}
|
||||
|
||||
var ids = make([]int, len(records))
|
||||
@ -156,18 +145,16 @@ func (s *GenericOAuth) FetchOrganizations(client *http.Client) ([]string, error)
|
||||
Login string `json:"login"`
|
||||
}
|
||||
|
||||
url := fmt.Sprintf(s.apiUrl + "/orgs")
|
||||
r, err := client.Get(url)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/orgs"))
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting organizations: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&records); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting organizations: %s", err)
|
||||
}
|
||||
|
||||
var logins = make([]string, len(records))
|
||||
@ -188,16 +175,14 @@ func (s *GenericOAuth) UserInfo(client *http.Client) (*BasicUserInfo, error) {
|
||||
Attributes map[string][]string `json:"attributes"`
|
||||
}
|
||||
|
||||
var err error
|
||||
r, err := client.Get(s.apiUrl)
|
||||
body, err := HttpGet(client, s.apiUrl)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&data); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &data)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
userInfo := &BasicUserInfo{
|
||||
|
@ -85,18 +85,16 @@ func (s *SocialGithub) FetchPrivateEmail(client *http.Client) (string, error) {
|
||||
Verified bool `json:"verified"`
|
||||
}
|
||||
|
||||
emailsUrl := fmt.Sprintf(s.apiUrl + "/emails")
|
||||
r, err := client.Get(emailsUrl)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/emails"))
|
||||
if err != nil {
|
||||
return "", err
|
||||
return "", fmt.Errorf("Error getting email address: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&records); err != nil {
|
||||
return "", err
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
return "", fmt.Errorf("Error getting email address: %s", err)
|
||||
}
|
||||
|
||||
var email = ""
|
||||
@ -114,18 +112,16 @@ func (s *SocialGithub) FetchTeamMemberships(client *http.Client) ([]int, error)
|
||||
Id int `json:"id"`
|
||||
}
|
||||
|
||||
membershipUrl := fmt.Sprintf(s.apiUrl + "/teams")
|
||||
r, err := client.Get(membershipUrl)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/teams"))
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting team memberships: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&records); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting team memberships: %s", err)
|
||||
}
|
||||
|
||||
var ids = make([]int, len(records))
|
||||
@ -141,18 +137,16 @@ func (s *SocialGithub) FetchOrganizations(client *http.Client) ([]string, error)
|
||||
Login string `json:"login"`
|
||||
}
|
||||
|
||||
url := fmt.Sprintf(s.apiUrl + "/orgs")
|
||||
r, err := client.Get(url)
|
||||
body, err := HttpGet(client, fmt.Sprintf(s.apiUrl+"/orgs"))
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting organizations: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
var records []Record
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&records); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &records)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting organizations: %s", err)
|
||||
}
|
||||
|
||||
var logins = make([]string, len(records))
|
||||
@ -170,16 +164,14 @@ func (s *SocialGithub) UserInfo(client *http.Client) (*BasicUserInfo, error) {
|
||||
Email string `json:"email"`
|
||||
}
|
||||
|
||||
var err error
|
||||
r, err := client.Get(s.apiUrl)
|
||||
body, err := HttpGet(client, s.apiUrl)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&data); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &data)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
userInfo := &BasicUserInfo{
|
||||
|
@ -2,6 +2,7 @@ package social
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"net/http"
|
||||
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
@ -34,16 +35,17 @@ func (s *SocialGoogle) UserInfo(client *http.Client) (*BasicUserInfo, error) {
|
||||
Name string `json:"name"`
|
||||
Email string `json:"email"`
|
||||
}
|
||||
var err error
|
||||
|
||||
r, err := client.Get(s.apiUrl)
|
||||
body, err := HttpGet(client, s.apiUrl)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
defer r.Body.Close()
|
||||
if err = json.NewDecoder(r.Body).Decode(&data); err != nil {
|
||||
return nil, err
|
||||
|
||||
err = json.Unmarshal(body, &data)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
return &BasicUserInfo{
|
||||
Name: data.Name,
|
||||
Email: data.Email,
|
||||
|
@ -2,6 +2,7 @@ package social
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"net/http"
|
||||
|
||||
"github.com/grafana/grafana/pkg/models"
|
||||
@ -9,7 +10,7 @@ import (
|
||||
"golang.org/x/oauth2"
|
||||
)
|
||||
|
||||
type SocialGrafanaNet struct {
|
||||
type SocialGrafanaCom struct {
|
||||
*oauth2.Config
|
||||
url string
|
||||
allowedOrganizations []string
|
||||
@ -20,19 +21,19 @@ type OrgRecord struct {
|
||||
Login string `json:"login"`
|
||||
}
|
||||
|
||||
func (s *SocialGrafanaNet) Type() int {
|
||||
return int(models.GRAFANANET)
|
||||
func (s *SocialGrafanaCom) Type() int {
|
||||
return int(models.GRAFANA_COM)
|
||||
}
|
||||
|
||||
func (s *SocialGrafanaNet) IsEmailAllowed(email string) bool {
|
||||
func (s *SocialGrafanaCom) IsEmailAllowed(email string) bool {
|
||||
return true
|
||||
}
|
||||
|
||||
func (s *SocialGrafanaNet) IsSignupAllowed() bool {
|
||||
func (s *SocialGrafanaCom) IsSignupAllowed() bool {
|
||||
return s.allowSignup
|
||||
}
|
||||
|
||||
func (s *SocialGrafanaNet) IsOrganizationMember(organizations []OrgRecord) bool {
|
||||
func (s *SocialGrafanaCom) IsOrganizationMember(organizations []OrgRecord) bool {
|
||||
if len(s.allowedOrganizations) == 0 {
|
||||
return true
|
||||
}
|
||||
@ -48,7 +49,7 @@ func (s *SocialGrafanaNet) IsOrganizationMember(organizations []OrgRecord) bool
|
||||
return false
|
||||
}
|
||||
|
||||
func (s *SocialGrafanaNet) UserInfo(client *http.Client) (*BasicUserInfo, error) {
|
||||
func (s *SocialGrafanaCom) UserInfo(client *http.Client) (*BasicUserInfo, error) {
|
||||
var data struct {
|
||||
Name string `json:"name"`
|
||||
Login string `json:"username"`
|
||||
@ -57,16 +58,14 @@ func (s *SocialGrafanaNet) UserInfo(client *http.Client) (*BasicUserInfo, error)
|
||||
Orgs []OrgRecord `json:"orgs"`
|
||||
}
|
||||
|
||||
var err error
|
||||
r, err := client.Get(s.url + "/api/oauth2/user")
|
||||
body, err := HttpGet(client, s.url+"/api/oauth2/user")
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
defer r.Body.Close()
|
||||
|
||||
if err = json.NewDecoder(r.Body).Decode(&data); err != nil {
|
||||
return nil, err
|
||||
err = json.Unmarshal(body, &data)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error getting user info: %s", err)
|
||||
}
|
||||
|
||||
userInfo := &BasicUserInfo{
|
@ -47,7 +47,7 @@ func NewOAuthService() {
|
||||
setting.OAuthService = &setting.OAuther{}
|
||||
setting.OAuthService.OAuthInfos = make(map[string]*setting.OAuthInfo)
|
||||
|
||||
allOauthes := []string{"github", "google", "generic_oauth", "grafananet"}
|
||||
allOauthes := []string{"github", "google", "generic_oauth", "grafananet", "grafana_com"}
|
||||
|
||||
for _, name := range allOauthes {
|
||||
sec := setting.Cfg.Section("auth." + name)
|
||||
@ -72,6 +72,10 @@ func NewOAuthService() {
|
||||
continue
|
||||
}
|
||||
|
||||
if name == "grafananet" {
|
||||
name = "grafana_com"
|
||||
}
|
||||
|
||||
setting.OAuthService.OAuthInfos[name] = info
|
||||
|
||||
config := oauth2.Config{
|
||||
@ -120,21 +124,21 @@ func NewOAuthService() {
|
||||
}
|
||||
}
|
||||
|
||||
if name == "grafananet" {
|
||||
if name == "grafana_com" {
|
||||
config = oauth2.Config{
|
||||
ClientID: info.ClientId,
|
||||
ClientSecret: info.ClientSecret,
|
||||
Endpoint: oauth2.Endpoint{
|
||||
AuthURL: setting.GrafanaNetUrl + "/oauth2/authorize",
|
||||
TokenURL: setting.GrafanaNetUrl + "/api/oauth2/token",
|
||||
AuthURL: setting.GrafanaComUrl + "/oauth2/authorize",
|
||||
TokenURL: setting.GrafanaComUrl + "/api/oauth2/token",
|
||||
},
|
||||
RedirectURL: strings.TrimSuffix(setting.AppUrl, "/") + SocialBaseUrl + name,
|
||||
Scopes: info.Scopes,
|
||||
}
|
||||
|
||||
SocialMap["grafananet"] = &SocialGrafanaNet{
|
||||
SocialMap["grafana_com"] = &SocialGrafanaCom{
|
||||
Config: &config,
|
||||
url: setting.GrafanaNetUrl,
|
||||
url: setting.GrafanaComUrl,
|
||||
allowSignup: info.AllowSignup,
|
||||
allowedOrganizations: util.SplitString(sec.Key("allowed_organizations").String()),
|
||||
}
|
||||
|
@@ -73,7 +73,7 @@ func (m *MySqlMacroEngine) EvaluateMacro(name string, args []string) (string, er
		if len(args) == 0 {
			return "", fmt.Errorf("missing time column argument for macro %v", name)
		}
-		return fmt.Sprintf("%s > FROM_UNIXTIME(%d) AND %s < FROM_UNIXTIME(%d)", args[0], uint64(m.TimeRange.GetFromAsMsEpoch()/1000), args[0], uint64(m.TimeRange.GetToAsMsEpoch()/1000)), nil
+		return fmt.Sprintf("%s >= FROM_UNIXTIME(%d) AND %s <= FROM_UNIXTIME(%d)", args[0], uint64(m.TimeRange.GetFromAsMsEpoch()/1000), args[0], uint64(m.TimeRange.GetToAsMsEpoch()/1000)), nil
	default:
		return "", fmt.Errorf("Unknown macro %v", name)
	}
|
@ -36,7 +36,7 @@ func TestMacroEngine(t *testing.T) {
|
||||
sql, err := engine.Interpolate("WHERE $__timeFilter(time_column)")
|
||||
So(err, ShouldBeNil)
|
||||
|
||||
So(sql, ShouldEqual, "WHERE time_column > FROM_UNIXTIME(18446744066914186738) AND time_column < FROM_UNIXTIME(18446744066914187038)")
|
||||
So(sql, ShouldEqual, "WHERE time_column >= FROM_UNIXTIME(18446744066914186738) AND time_column <= FROM_UNIXTIME(18446744066914187038)")
|
||||
})
|
||||
|
||||
})
|
||||
|
@ -52,8 +52,8 @@ func TestTimeRange(t *testing.T) {
|
||||
})
|
||||
|
||||
Convey("now-10m ", func() {
|
||||
fiveMinAgo, _ := time.ParseDuration("-10m")
|
||||
expected := now.Add(fiveMinAgo)
|
||||
tenMinAgo, _ := time.ParseDuration("-10m")
|
||||
expected := now.Add(tenMinAgo)
|
||||
res, err := tr.ParseTo()
|
||||
So(err, ShouldBeNil)
|
||||
So(res.Unix(), ShouldEqual, expected.Unix())
|
||||
|
public/app/core/components/collapse_box.ts (new file)
@@ -0,0 +1,58 @@
///<reference path="../../headers/common.d.ts" />

import coreModule from 'app/core/core_module';

const template = `
<div class="collapse-box">
  <div class="collapse-box__header">
    <a class="collapse-box__header-title pointer" ng-click="ctrl.toggle()">
      <span class="fa fa-fw fa-caret-right" ng-hide="ctrl.isOpen"></span>
      <span class="fa fa-fw fa-caret-down" ng-hide="!ctrl.isOpen"></span>
      {{ctrl.title}}
    </a>
    <div class="collapse-box__header-actions" ng-transclude="actions" ng-if="ctrl.isOpen"></div>
  </div>
  <div class="collapse-box__body" ng-transclude="body" ng-if="ctrl.isOpen">
  </div>
</div>
`;

export class CollapseBoxCtrl {
  isOpen: boolean;
  stateChanged: () => void;

  /** @ngInject **/
  constructor(private $timeout) {
    this.isOpen = false;
  }

  toggle() {
    this.isOpen = !this.isOpen;
    this.$timeout(() => {
      this.stateChanged();
    });
  }
}

export function collapseBox() {
  return {
    restrict: 'E',
    template: template,
    controller: CollapseBoxCtrl,
    bindToController: true,
    controllerAs: 'ctrl',
    scope: {
      "title": "@",
      "isOpen": "=?",
      "stateChanged": "&"
    },
    transclude: {
      'actions': '?collapseBoxActions',
      'body': 'collapseBoxBody',
    },
    link: function(scope, elem, attrs) {
    }
  };
}

coreModule.directive('collapseBox', collapseBox);
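A minimal usage sketch (not part of this commit; the host markup and handler names below are hypothetical) showing how the scope bindings and the two transclusion slots declared above are wired up. Angular normalizes `collapseBoxActions` and `collapseBoxBody` to the dash-cased element names used here:

```typescript
// Hypothetical host template for the collapse-box directive added above.
// "title" maps to the "@" binding, "is-open" to "=?", and "state-changed"
// to the "&" expression binding invoked from CollapseBoxCtrl.toggle().
const exampleUsage = `
<collapse-box title="Conditions" is-open="ctrl.sectionOpen" state-changed="ctrl.onSectionToggled()">
  <collapse-box-actions>
    <a class="pointer" ng-click="ctrl.addItem()">Add</a>
  </collapse-box-actions>
  <collapse-box-body>
    <div ng-repeat="item in ctrl.items">{{item.name}}</div>
  </collapse-box-body>
</collapse-box>
`;

export default exampleUsage;
```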
public/app/core/components/form_dropdown/form_dropdown.ts (new file)
@@ -0,0 +1,250 @@
///<reference path="../../../headers/common.d.ts" />

import config from 'app/core/config';
import _ from 'lodash';
import $ from 'jquery';
import coreModule from '../../core_module';

function typeaheadMatcher(item) {
  var str = this.query;
  if (str[0] === '/') { str = str.substring(1); }
  if (str[str.length - 1] === '/') { str = str.substring(0, str.length-1); }
  return item.toLowerCase().match(str.toLowerCase());
}

export class FormDropdownCtrl {
  inputElement: any;
  linkElement: any;
  model: any;
  display: any;
  text: any;
  options: any;
  cssClass: any;
  cssClasses: any;
  allowCustom: any;
  labelMode: boolean;
  linkMode: boolean;
  cancelBlur: any;
  onChange: any;
  getOptions: any;
  optionCache: any;
  lookupText: boolean;

  constructor(private $scope, $element, private $sce, private templateSrv, private $q) {
    this.inputElement = $element.find('input').first();
    this.linkElement = $element.find('a').first();
    this.linkMode = true;
    this.cancelBlur = null;

    // listen to model changes
    $scope.$watch("ctrl.model", this.modelChanged.bind(this));

    if (this.labelMode) {
      this.cssClasses = 'gf-form-label ' + this.cssClass;
    } else {
      this.cssClasses = 'gf-form-input gf-form-input--dropdown ' + this.cssClass;
    }

    this.inputElement.attr('data-provide', 'typeahead');
    this.inputElement.typeahead({
      source: this.typeaheadSource.bind(this),
      minLength: 0,
      items: 10000,
      updater: this.typeaheadUpdater.bind(this),
      matcher: typeaheadMatcher,
    });

    // modify typeahead lookup
    // this = typeahead
    var typeahead = this.inputElement.data('typeahead');
    typeahead.lookup = function () {
      this.query = this.$element.val() || '';
      var items = this.source(this.query, $.proxy(this.process, this));
      return items ? this.process(items) : items;
    };

    this.linkElement.keydown(evt => {
      // trigger typeahead on down arrow or enter key
      if (evt.keyCode === 40 || evt.keyCode === 13) {
        this.linkElement.click();
      }
    });

    this.inputElement.keydown(evt => {
      if (evt.keyCode === 13) {
        setTimeout(() => {
          this.inputElement.blur();
        }, 100);
      }
    });

    this.inputElement.blur(this.inputBlur.bind(this));
  }

  getOptionsInternal(query) {
    var result = this.getOptions({$query: query});
    if (this.isPromiseLike(result)) {
      return result;
    }
    return this.$q.when(result);
  }

  isPromiseLike(obj) {
    return obj && (typeof obj.then === 'function');
  }

  modelChanged() {
    if (_.isObject(this.model)) {
      this.updateDisplay(this.model.text);
    } else {
      // if we have text use it
      if (this.lookupText) {
        this.getOptionsInternal("").then(options => {
          var item = _.find(options, {value: this.model});
          this.updateDisplay(item ? item.text : this.model);
        });
      } else {
        this.updateDisplay(this.model);
      }
    }
  }

  typeaheadSource(query, callback) {
    this.getOptionsInternal(query).then(options => {
      this.optionCache = options;

      // extract texts
      let optionTexts = _.map(options, 'text');

      // add custom values
      if (this.allowCustom) {
        if (_.indexOf(optionTexts, this.text) === -1) {
          options.unshift(this.text);
        }
      }

      callback(optionTexts);
    });
  }

  typeaheadUpdater(text) {
    if (text === this.text) {
      clearTimeout(this.cancelBlur);
      this.inputElement.focus();
      return text;
    }

    this.inputElement.val(text);
    this.switchToLink(true);
    return text;
  }

  switchToLink(fromClick) {
    if (this.linkMode && !fromClick) { return; }

    clearTimeout(this.cancelBlur);
    this.cancelBlur = null;
    this.linkMode = true;
    this.inputElement.hide();
    this.linkElement.show();
    this.updateValue(this.inputElement.val());
  }

  inputBlur() {
    // happens long before the click event on the typeahead options
    // need to have long delay because the blur
    this.cancelBlur = setTimeout(this.switchToLink.bind(this), 200);
  }

  updateValue(text) {
    if (text === '' || this.text === text) {
      return;
    }

    this.$scope.$apply(() => {
      var option = _.find(this.optionCache, {text: text});

      if (option) {
        if (_.isObject(this.model)) {
          this.model = option;
        } else {
          this.model = option.value;
        }
        this.text = option.text;
      } else if (this.allowCustom) {
        if (_.isObject(this.model)) {
          this.model.text = this.model.value = text;
        } else {
          this.model = text;
        }
        this.text = text;
      }

      // needs to call this after digest so
      // property is synced with outerscope
      this.$scope.$$postDigest(() => {
        this.$scope.$apply(() => {
          this.onChange({$option: option});
        });
      });

    });
  }

  updateDisplay(text) {
    this.text = text;
    this.display = this.$sce.trustAsHtml(this.templateSrv.highlightVariablesAsHtml(text));
  }

  open() {
    this.inputElement.show();

    this.inputElement.css('width', (Math.max(this.linkElement.width(), 80) + 16) + 'px');
    this.inputElement.focus();

    this.linkElement.hide();
    this.linkMode = false;

    var typeahead = this.inputElement.data('typeahead');
    if (typeahead) {
      this.inputElement.val('');
      typeahead.lookup();
    }
  }
}

const template = `
<input type="text"
  data-provide="typeahead"
  class="gf-form-input"
  spellcheck="false"
  style="display:none">
</input>
<a ng-class="ctrl.cssClasses"
  tabindex="1"
  ng-click="ctrl.open()"
  give-focus="ctrl.focus"
  ng-bind-html="ctrl.display">
</a>
`;

export function formDropdownDirective() {
  return {
    restrict: 'E',
    template: template,
    controller: FormDropdownCtrl,
    bindToController: true,
    controllerAs: 'ctrl',
    scope: {
      model: "=",
      getOptions: "&",
      onChange: "&",
      cssClass: "@",
      allowCustom: "@",
      labelMode: "@",
      lookupText: "@",
    },
  };
}

coreModule.directive('gfFormDropdown', formDropdownDirective);
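A minimal sketch of how a consumer can use the directive (controller name, option values and markup below are hypothetical, not part of this commit). `get-options` may return either a plain array or a promise of `{text, value}` objects, and `on-change` receives the selected `$option`:

```typescript
// Hypothetical panel controller demonstrating the gf-form-dropdown contract.
export class ExamplePanelCtrl {
  aggregation = 'avg';

  getAggregations() {
    // A plain array also works; FormDropdownCtrl wraps non-promises with $q.when().
    return Promise.resolve([
      {text: 'Average', value: 'avg'},
      {text: 'Maximum', value: 'max'},
      {text: 'Minimum', value: 'min'},
    ]);
  }

  aggregationChanged(option) {
    console.log('selected', option);
  }
}

// Matching markup (hypothetical): attribute names are the dash-cased versions
// of the scope bindings declared in formDropdownDirective().
export const exampleMarkup = `
<gf-form-dropdown model="ctrl.aggregation"
                  allow-custom="true"
                  lookup-text="true"
                  get-options="ctrl.getAggregations()"
                  on-change="ctrl.aggregationChanged($option)">
</gf-form-dropdown>
`;
```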
@@ -105,10 +105,14 @@ export function grafanaAppDirective(playlistSrv, contextSrv) {
       if (pageClass) {
         body.removeClass(pageClass);
       }
-      pageClass = data.$$route.pageClass;
-      if (pageClass) {
-        body.addClass(pageClass);
+
+      if (data.$$route) {
+        pageClass = data.$$route.pageClass;
+        if (pageClass) {
+          body.addClass(pageClass);
+        }
       }

       $("#tooltip, .tooltip").remove();

       // check for kiosk url param
@@ -194,6 +198,15 @@ export function grafanaAppDirective(playlistSrv, contextSrv) {
         });
       }
     }

+      // hide menus
+      var openMenus = body.find('.navbar-page-btn--open');
+      if (openMenus.length > 0) {
+        if (target.parents('.navbar-page-btn--open').length === 0) {
+          openMenus.removeClass('navbar-page-btn--open');
+        }
+      }
+
       // hide sidemenu
       if (!ignoreSideMenuHide && !contextSrv.pinned && body.find('.sidemenu').length > 0) {
         if (target.parents('.sidemenu').length === 0) {
@@ -1,7 +1,7 @@
 <div class="modal-body">
   <div class="modal-header">
     <h2 class="modal-header-title">
-      <i class="fa fa-keyboard"></i>
+      <i class="fa fa-keyboard-o"></i>
       <span class="p-l-1">Shortcuts</span>
     </h2>
@@ -20,7 +20,7 @@

   <div class="modal-content help-modal">

-    <p class="small" style="position: absolute; top: 48px; right: 10px">
+    <p class="small" style="position: absolute; top: 13px; right: 44px">
       <span class="shortcut-table-key">mod</span> =
       <span class="muted">CTRL on windows or linux and CMD key on Mac</span>
     </p>
public/app/core/components/json_explorer/helpers.ts (new file)
@@ -0,0 +1,113 @@
// Based on work https://github.com/mohsen1/json-formatter-js
// Licence MIT, Copyright (c) 2015 Mohsen Azimi

/*
 * Escapes `"` charachters from string
 */
function escapeString(str: string): string {
  return str.replace('"', '\"');
}

/*
 * Determines if a value is an object
 */
export function isObject(value: any): boolean {
  var type = typeof value;
  return !!value && (type === 'object');
}

/*
 * Gets constructor name of an object.
 * From http://stackoverflow.com/a/332429
 *
 */
export function getObjectName(object: Object): string {
  if (object === undefined) {
    return '';
  }
  if (object === null) {
    return 'Object';
  }
  if (typeof object === 'object' && !object.constructor) {
    return 'Object';
  }

  const funcNameRegex = /function ([^(]*)/;
  const results = (funcNameRegex).exec((object).constructor.toString());
  if (results && results.length > 1) {
    return results[1];
  } else {
    return '';
  }
}

/*
 * Gets type of an object. Returns "null" for null objects
 */
export function getType(object: Object): string {
  if (object === null) { return 'null'; }
  return typeof object;
}

/*
 * Generates inline preview for a JavaScript object based on a value
 */
export function getValuePreview (object: Object, value: string): string {
  var type = getType(object);

  if (type === 'null' || type === 'undefined') { return type; }

  if (type === 'string') {
    value = '"' + escapeString(value) + '"';
  }
  if (type === 'function'){

    // Remove content of the function
    return object.toString()
      .replace(/[\r\n]/g, '')
      .replace(/\{.*\}/, '') + '{…}';
  }
  return value;
}

/*
 * Generates inline preview for a JavaScript object
 */
export function getPreview(object: string): string {
  let value = '';
  if (isObject(object)) {
    value = getObjectName(object);
    if (Array.isArray(object)) {
      value += '[' + object.length + ']';
    }
  } else {
    value = getValuePreview(object, object);
  }
  return value;
}

/*
 * Generates a prefixed CSS class name
 */
export function cssClass(className: string): string {
  return `json-formatter-${className}`;
}

/*
 * Creates a new DOM element wiht given type and class
 * TODO: move me to helpers
 */
export function createElement(type: string, className?: string, content?: Element|string): Element {
  const el = document.createElement(type);
  if (className) {
    el.classList.add(cssClass(className));
  }
  if (content !== undefined) {
    if (content instanceof Node) {
      el.appendChild(content);
    } else {
      el.appendChild(document.createTextNode(String(content)));
    }
  }
  return el;
}
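A small, self-contained sketch (not part of the commit; the sample payload and relative import path are assumptions) showing how these helpers compose: `getPreview()` produces the inline summary text and `createElement()` wraps it in a `json-formatter-*` classed DOM node:

```typescript
import { createElement, getPreview } from './helpers';

// getPreview() is typed to take a string but is used with arbitrary values,
// hence the casts. Objects preview as their constructor name, arrays as
// "Array[length]".
const payload = {name: 'CPU', datapoints: [1, 2, 3]};

console.log(getPreview(payload as any));             // -> "Object"
console.log(getPreview(payload.datapoints as any));  // -> "Array[3]"

// Renders <span class="json-formatter-preview">Object</span>
const node = createElement('span', 'preview', getPreview(payload as any));
document.body.appendChild(node);
```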
public/app/core/components/json_explorer/json_explorer.ts (new file)
@@ -0,0 +1,431 @@
// Based on work https://github.com/mohsen1/json-formatter-js
// Licence MIT, Copyright (c) 2015 Mohsen Azimi

import {
  isObject,
  getObjectName,
  getType,
  getValuePreview,
  getPreview,
  cssClass,
  createElement
} from './helpers';

import _ from 'lodash';

const DATE_STRING_REGEX = /(^\d{1,4}[\.|\\/|-]\d{1,2}[\.|\\/|-]\d{1,4})(\s*(?:0?[1-9]:[0-5]|1(?=[012])\d:[0-5])\d\s*[ap]m)?$/;
const PARTIAL_DATE_REGEX = /\d{2}:\d{2}:\d{2} GMT-\d{4}/;
const JSON_DATE_REGEX = /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.\d{3}Z/;

// When toggleing, don't animated removal or addition of more than a few items
const MAX_ANIMATED_TOGGLE_ITEMS = 10;

const requestAnimationFrame = window.requestAnimationFrame || function(cb: ()=>void) { cb(); return 0; };

export interface JsonExplorerConfig {
  animateOpen?: boolean;
  animateClose?: boolean;
  theme?: string;
}

const _defaultConfig: JsonExplorerConfig = {
  animateOpen: true,
  animateClose: true,
  theme: null
};

/**
 * @class JsonExplorer
 *
 * JsonExplorer allows you to render JSON objects in HTML with a
 * **collapsible** navigation.
 */
export class JsonExplorer {

  // Hold the open state after the toggler is used
  private _isOpen: boolean = null;

  // A reference to the element that we render to
  private element: Element;

  private skipChildren = false;

  /**
   * @param {object} json The JSON object you want to render. It has to be an
   * object or array. Do NOT pass raw JSON string.
   *
   * @param {number} [open=1] his number indicates up to how many levels the
   * rendered tree should expand. Set it to `0` to make the whole tree collapsed
   * or set it to `Infinity` to expand the tree deeply
   *
   * @param {object} [config=defaultConfig] -
   *  defaultConfig = {
   *   hoverPreviewEnabled: false,
   *   hoverPreviewArrayCount: 100,
   *   hoverPreviewFieldCount: 5
   * }
   *
   * Available configurations:
   *  #####Hover Preview
   * * `hoverPreviewEnabled`: enable preview on hover
   * * `hoverPreviewArrayCount`: number of array items to show in preview Any
   *    array larger than this number will be shown as `Array[XXX]` where `XXX`
   *    is length of the array.
   * * `hoverPreviewFieldCount`: number of object properties to show for object
   *   preview. Any object with more properties that thin number will be
   *   truncated.
   *
   * @param {string} [key=undefined] The key that this object in it's parent
   * context
   */
  constructor(public json: any, private open = 1, private config: JsonExplorerConfig = _defaultConfig, private key?: string) {
  }

  /*
   * is formatter open?
   */
  private get isOpen(): boolean {
    if (this._isOpen !== null) {
      return this._isOpen;
    } else {
      return this.open > 0;
    }
  }

  /*
   * set open state (from toggler)
   */
  private set isOpen(value: boolean) {
    this._isOpen = value;
  }

  /*
   * is this a date string?
   */
  private get isDate(): boolean {
    return (this.type === 'string') &&
      (DATE_STRING_REGEX.test(this.json) ||
      JSON_DATE_REGEX.test(this.json) ||
      PARTIAL_DATE_REGEX.test(this.json));
  }

  /*
   * is this a URL string?
   */
  private get isUrl(): boolean {
    return this.type === 'string' && (this.json.indexOf('http') === 0);
  }

  /*
   * is this an array?
   */
  private get isArray(): boolean {
    return Array.isArray(this.json);
  }

  /*
   * is this an object?
   * Note: In this context arrays are object as well
   */
  private get isObject(): boolean {
    return isObject(this.json);
  }

  /*
   * is this an empty object with no properties?
   */
  private get isEmptyObject(): boolean {
    return !this.keys.length && !this.isArray;
  }

  /*
   * is this an empty object or array?
   */
  private get isEmpty(): boolean {
    return this.isEmptyObject || (this.keys && !this.keys.length && this.isArray);
  }

  /*
   * did we recieve a key argument?
   * This means that the formatter was called as a sub formatter of a parent formatter
   */
  private get hasKey(): boolean {
    return typeof this.key !== 'undefined';
  }

  /*
   * if this is an object, get constructor function name
   */
  private get constructorName(): string {
    return getObjectName(this.json);
  }

  /*
   * get type of this value
   * Possible values: all JavaScript primitive types plus "array" and "null"
   */
  private get type(): string {
    return getType(this.json);
  }

  /*
   * get object keys
   * If there is an empty key we pad it wit quotes to make it visible
   */
  private get keys(): string[] {
    if (this.isObject) {
      return Object.keys(this.json).map((key)=> key ? key : '""');
    } else {
      return [];
    }
  }

  /**
   * Toggles `isOpen` state
   *
   */
  toggleOpen() {
    this.isOpen = !this.isOpen;

    if (this.element) {
      if (this.isOpen) {
        this.appendChildren(this.config.animateOpen);
      } else{
        this.removeChildren(this.config.animateClose);
      }
      this.element.classList.toggle(cssClass('open'));
    }
  }

  /**
   * Open all children up to a certain depth.
   * Allows actions such as expand all/collapse all
   *
   */
  openAtDepth(depth = 1) {
    if (depth < 0) {
      return;
    }

    this.open = depth;
    this.isOpen = (depth !== 0);

    if (this.element) {
      this.removeChildren(false);

      if (depth === 0) {
        this.element.classList.remove(cssClass('open'));
      } else {
        this.appendChildren(this.config.animateOpen);
        this.element.classList.add(cssClass('open'));
      }
    }
  }

  isNumberArray() {
    return (this.json.length > 0 && this.json.length < 4) &&
      (_.isNumber(this.json[0]) || _.isNumber(this.json[1]));
  }

  renderArray() {
    const arrayWrapperSpan = createElement('span');
    arrayWrapperSpan.appendChild(createElement('span', 'bracket', '['));

    // some pretty handling of number arrays
    if (this.isNumberArray()) {
      this.json.forEach((val, index) => {
        if (index > 0) {
          arrayWrapperSpan.appendChild(createElement('span', 'array-comma', ','));
        }
        arrayWrapperSpan.appendChild(createElement('span', 'number', val));
      });
      this.skipChildren = true;
    } else {
      arrayWrapperSpan.appendChild(createElement('span', 'number', (this.json.length)));
    }

    arrayWrapperSpan.appendChild(createElement('span', 'bracket', ']'));
    return arrayWrapperSpan;
  }

  /**
   * Renders an HTML element and installs event listeners
   *
   * @returns {HTMLDivElement}
   */
  render(skipRoot = false): HTMLDivElement {
    // construct the root element and assign it to this.element
    this.element = createElement('div', 'row');

    // construct the toggler link
    const togglerLink = createElement('a', 'toggler-link');
    const togglerIcon = createElement('span', 'toggler');

    // if this is an object we need a wrapper span (toggler)
    if (this.isObject) {
      togglerLink.appendChild(togglerIcon);
    }

    // if this is child of a parent formatter we need to append the key
    if (this.hasKey) {
      togglerLink.appendChild(createElement('span', 'key', `${this.key}:`));
    }

    // Value for objects and arrays
    if (this.isObject) {
      // construct the value holder element
      const value = createElement('span', 'value');

      // we need a wrapper span for objects
      const objectWrapperSpan = createElement('span');

      // get constructor name and append it to wrapper span
      var constructorName = createElement('span', 'constructor-name', this.constructorName);
      objectWrapperSpan.appendChild(constructorName);

      // if it's an array append the array specific elements like brackets and length
      if (this.isArray) {
        const arrayWrapperSpan = this.renderArray();
        objectWrapperSpan.appendChild(arrayWrapperSpan);
      }

      // append object wrapper span to toggler link
      value.appendChild(objectWrapperSpan);
      togglerLink.appendChild(value);
    // Primitive values
    } else {

      // make a value holder element
      const value = this.isUrl ? createElement('a') : createElement('span');

      // add type and other type related CSS classes
      value.classList.add(cssClass(this.type));
      if (this.isDate) {
        value.classList.add(cssClass('date'));
      }
      if (this.isUrl) {
        value.classList.add(cssClass('url'));
        value.setAttribute('href', this.json);
      }

      // Append value content to value element
      const valuePreview = getValuePreview(this.json, this.json);
      value.appendChild(document.createTextNode(valuePreview));

      // append the value element to toggler link
      togglerLink.appendChild(value);
    }

    // construct a children element
    const children = createElement('div', 'children');

    // set CSS classes for children
    if (this.isObject) {
      children.classList.add(cssClass('object'));
    }
    if (this.isArray) {
      children.classList.add(cssClass('array'));
    }
    if (this.isEmpty) {
      children.classList.add(cssClass('empty'));
    }

    // set CSS classes for root element
    if (this.config && this.config.theme) {
      this.element.classList.add(cssClass(this.config.theme));
    }
    if (this.isOpen) {
      this.element.classList.add(cssClass('open'));
    }

    // append toggler and children elements to root element
    if (!skipRoot) {
      this.element.appendChild(togglerLink);
    }

    if (!this.skipChildren) {
      this.element.appendChild(children);
    } else {
      // remove togglerIcon
      togglerLink.removeChild(togglerIcon);
    }

    // if formatter is set to be open call appendChildren
    if (this.isObject && this.isOpen) {
      this.appendChildren();
    }

    // add event listener for toggling
    if (this.isObject) {
      togglerLink.addEventListener('click', this.toggleOpen.bind(this));
    }

    return this.element as HTMLDivElement;
  }

  /**
   * Appends all the children to children element
   * Animated option is used when user triggers this via a click
   */
  appendChildren(animated = false) {
    const children = this.element.querySelector(`div.${cssClass('children')}`);

    if (!children || this.isEmpty) { return; }

    if (animated) {
      let index = 0;
      const addAChild = ()=> {
        const key = this.keys[index];
        const formatter = new JsonExplorer(this.json[key], this.open - 1, this.config, key);
        children.appendChild(formatter.render());

        index += 1;

        if (index < this.keys.length) {
          if (index > MAX_ANIMATED_TOGGLE_ITEMS) {
            addAChild();
          } else {
            requestAnimationFrame(addAChild);
          }
        }
      };

      requestAnimationFrame(addAChild);

    } else {
      this.keys.forEach(key => {
        const formatter = new JsonExplorer(this.json[key], this.open - 1, this.config, key);
        children.appendChild(formatter.render());
      });
    }
  }

  /**
   * Removes all the children from children element
   * Animated option is used when user triggers this via a click
   */
  removeChildren(animated = false) {
    const childrenElement = this.element.querySelector(`div.${cssClass('children')}`) as HTMLDivElement;

    if (animated) {
      let childrenRemoved = 0;
      const removeAChild = ()=> {
        if (childrenElement && childrenElement.children.length) {
          childrenElement.removeChild(childrenElement.children[0]);
          childrenRemoved += 1;
          if (childrenRemoved > MAX_ANIMATED_TOGGLE_ITEMS) {
            removeAChild();
          } else {
            requestAnimationFrame(removeAChild);
          }
        }
      };
      requestAnimationFrame(removeAChild);
    } else {
      if (childrenElement) {
        childrenElement.innerHTML = '';
      }
    }
  }
}
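A minimal usage sketch (not part of this commit; the sample data, host selector and relative import path are assumptions): render a JSON payload into a host element, expanded two levels deep.

```typescript
import { JsonExplorer } from './json_explorer';

const dashboard = {
  title: 'Requests',
  rows: [{panels: [{type: 'graph', span: 12}]}],
};

// open=2 expands two levels; pass 0 to start fully collapsed or Infinity to
// expand everything. animateOpen/animateClose come from JsonExplorerConfig.
const explorer = new JsonExplorer(dashboard, 2, {animateOpen: true, animateClose: true, theme: null});

const host = document.querySelector('.my-json-host');
if (host) {
  host.appendChild(explorer.render());
}
```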
@@ -8,11 +8,36 @@
     <i class="fa fa-chevron-left"></i>
   </a>

-  <a href="{{ctrl.titleUrl}}" class="navbar-page-btn" ng-show="ctrl.title">
-    <i class="{{ctrl.icon}}" ng-show="ctrl.icon"></i>
-    <img ng-src="{{ctrl.iconUrl}}" ng-show="ctrl.iconUrl"></i>
-    {{ctrl.title}}
-  </a>
-
-  <div ng-transclude></div>
+  <!-- <a class="navbar-page-btn navbar-page-btn--search" ng-click="ctrl.showSearch()"> -->
+  <!--   <i class="fa fa-search"></i> -->
+  <!-- </a> -->
+
+  <div ng-if="::!ctrl.hasMenu">
+    <a href="{{::ctrl.section.url}}" class="navbar-page-btn">
+      <i class="{{::ctrl.section.icon}}" ng-show="::ctrl.section.icon"></i>
+      <img ng-src="{{::ctrl.section.iconUrl}}" ng-show="::ctrl.section.iconUrl"></i>
+      {{::ctrl.section.title}}
+    </a>
+  </div>
+
+  <div class="dropdown navbar-section-wrapper" ng-if="::ctrl.hasMenu">
+    <a href="{{::ctrl.section.url}}" class="navbar-page-btn" data-toggle="dropdown">
+      <i class="{{::ctrl.section.icon}}" ng-show="::ctrl.section.icon"></i>
+      <img ng-src="{{::ctrl.section.iconUrl}}" ng-show="::ctrl.section.iconUrl"></i>
+      {{::ctrl.section.title}}
+      <i class="fa fa-caret-down"></i>
+    </a>
+    <ul class="dropdown-menu dropdown-menu--navbar">
+      <li ng-repeat="navItem in ::ctrl.model.menu" ng-class="{active: navItem.active}">
+        <a class="pointer" ng-href="{{::navItem.url}}" ng-click="ctrl.navItemClicked(navItem, $event)">
+          <i class="{{::navItem.icon}}" ng-show="::navItem.icon"></i>
+          {{::navItem.title}}
+        </a>
+      </li>
+    </ul>
+  </div>
+
+  <div ng-transclude></div>
 </div>

+<dashboard-search></dashboard-search>
@@ -4,10 +4,28 @@ import config from 'app/core/config';
 import _ from 'lodash';
 import $ from 'jquery';
 import coreModule from '../../core_module';
+import {NavModel, NavModelItem} from '../../nav_model_srv';

 export class NavbarCtrl {
+  model: NavModel;
+  section: NavModelItem;
+  hasMenu: boolean;

   /** @ngInject */
-  constructor(private $scope, private contextSrv) {
+  constructor(private $scope, private $rootScope, private contextSrv) {
+    this.section = this.model.section;
+    this.hasMenu = this.model.menu.length > 0;
   }

+  showSearch() {
+    this.$rootScope.appEvent('show-dash-search');
+  }
+
+  navItemClicked(navItem, evt) {
+    if (navItem.clickHandler) {
+      navItem.clickHandler();
+      evt.preventDefault();
+    }
+  }
 }

@@ -20,12 +38,9 @@ export function navbarDirective() {
     transclude: true,
     controllerAs: 'ctrl',
     scope: {
-      title: "@",
-      titleUrl: "@",
-      iconUrl: "@",
+      model: "=",
     },
-    link: function(scope, elem, attrs, ctrl) {
-      ctrl.icon = attrs.icon;
+    link: function(scope, elem) {
       elem.addClass('navbar');
     }
   };
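A sketch of the model shape the refactored navbar now expects (hypothetical page controller; the field names are taken from the template and NavbarCtrl above, but the exact `NavModel` interface lives in `nav_model_srv` and may differ in detail):

```typescript
// Hypothetical page controller: the directive takes a whole nav model
// instead of the old title/title-url/icon-url attributes.
export class ExamplePageCtrl {
  navModel = {
    section: {
      title: 'Data Sources',
      url: 'datasources',
      icon: 'icon-gf icon-gf-datasources',
    },
    menu: [
      {title: 'Add data source', url: 'datasources/new', icon: 'fa fa-plus', active: false},
      {title: 'Plugins', url: 'plugins', clickHandler: () => console.log('clicked'), active: false},
    ],
  };
}

// Matching markup (hypothetical).
export const exampleMarkup = `<navbar model="ctrl.navModel"></navbar>`;
```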
@@ -1,9 +1,22 @@

<div class="search-backdrop" ng-if="ctrl.isOpen"></div>

<div class="search-container" ng-if="ctrl.isOpen">

  <div class="search-field-wrapper">
    <span style="position: relative;">
      <input type="text" placeholder="Find dashboards by name" give-focus="ctrl.giveSearchFocus" tabindex="1"
        ng-keydown="ctrl.keyDown($event)" ng-model="ctrl.query.query" ng-model-options="{ debounce: 500 }" spellcheck='false' ng-change="ctrl.search()" />
    </span>
    <div class="search-field-icon pointer" ng-click="ctrl.closeSearch()">
      <i class="fa fa-search"></i>
    </div>

    <input type="text" placeholder="Find dashboards by name" give-focus="ctrl.giveSearchFocus" tabindex="1"
      ng-keydown="ctrl.keyDown($event)"
      ng-model="ctrl.query.query"
      ng-model-options="{ debounce: 500 }"
      spellcheck='false'
      ng-change="ctrl.search()"
      ng-blur="ctrl.searchInputBlur()"
      />

    <div class="search-switches">
      <i class="fa fa-filter"></i>
      <a class="pointer" href="javascript:void 0;" ng-click="ctrl.showStarred()" tabindex="2">
@@ -24,54 +37,55 @@
        </span>
      </span>
    </div>

    <div class="search-field-spacer"></div>
  </div>

  <div class="search-results-container" ng-if="ctrl.tagsMode">
    <div ng-repeat="tag in ctrl.results" class="pointer" style="width: 180px; float: left;"
      ng-class="{'selected': $index === ctrl.selectedIndex }"
      ng-click="ctrl.filterByTag(tag.term, $event)">
      <a class="search-result-tag label label-tag" tag-color-from-name="tag.term">
        <i class="fa fa-tag"></i>
        <span>{{tag.term}} ({{tag.count}})</span>

  <div class="search-dropdown" ng-class="{'search-dropdown--fade-in': ctrl.openCompleted}">
    <div class="search-results-container" ng-if="ctrl.tagsMode">
      <div ng-repeat="tag in ctrl.results" class="pointer" style="width: 180px; float: left;"
        ng-class="{'selected': $index === ctrl.selectedIndex }"
        ng-click="ctrl.filterByTag(tag.term, $event)">
        <a class="search-result-tag label label-tag" tag-color-from-name="tag.term">
          <i class="fa fa-tag"></i>
          <span>{{tag.term}} ({{tag.count}})</span>
        </a>
      </div>
    </div>

    <div class="search-results-container" ng-if="!ctrl.tagsMode">
      <h6 ng-hide="ctrl.results.length">No dashboards matching your query were found.</h6>

      <a class="search-item pointer search-item-{{row.type}}" bindonce ng-repeat="row in ctrl.results"
        ng-class="{'selected': $index == ctrl.selectedIndex}" ng-href="{{row.url}}">

        <span class="search-result-tags">
          <span ng-click="ctrl.filterByTag(tag, $event)" ng-repeat="tag in row.tags" tag-color-from-name="tag" class="label label-tag">
            {{tag}}
          </span>
          <i class="fa" ng-class="{'fa-star': row.isStarred, 'fa-star-o': !row.isStarred}"></i>
        </span>

        <span class="search-result-link">
          <i class="fa search-result-icon"></i>
          <span bo-text="row.title"></span>
        </span>
      </a>
    </div>

    <div class="search-button-row">
      <a class="btn btn-secondary" href="dashboard/new" ng-show="ctrl.contextSrv.isEditor" ng-click="ctrl.isOpen = false;">
        <i class="fa fa-plus"></i> New Dashboard
      </a>

      <a class="btn btn-inverse" href="dashboard/new/?editview=import" ng-show="ctrl.contextSrv.isEditor" ng-click="ctrl.isOpen = false;">
        <i class="fa fa-upload"></i> Import Dashboard
      </a>

      <a class="search-button-row-explore-link" target="_blank" href="https://grafana.com/dashboards?utm_source=grafana_search">
        Find <img src="public/img/icn-dashboard-tiny.svg" width="14" /> dashboards on Grafana.com
      </a>
    </div>
  </div>

  <div class="search-results-container" ng-if="!ctrl.tagsMode">
    <h6 ng-hide="ctrl.results.length">No dashboards matching your query were found.</h6>

    <a class="search-item pointer search-item-{{row.type}}" bindonce ng-repeat="row in ctrl.results"
      ng-class="{'selected': $index == ctrl.selectedIndex}" ng-href="{{row.url}}">

      <span class="search-result-tags">
        <span ng-click="ctrl.filterByTag(tag, $event)" ng-repeat="tag in row.tags" tag-color-from-name="tag" class="label label-tag">
          {{tag}}
        </span>
        <i class="fa" ng-class="{'fa-star': row.isStarred, 'fa-star-o': !row.isStarred}"></i>
      </span>

      <span class="search-result-link">
        <i class="fa search-result-icon"></i>
        <span bo-text="row.title"></span>
      </span>
    </a>
  </div>

  <div class="search-button-row">
    <a class="btn btn-inverse pull-left" href="dashboard/new" ng-show="ctrl.contextSrv.isEditor" ng-click="ctrl.isOpen = false;">
      <i class="fa fa-plus"></i>
      Create New
    </a>

    <a class="btn btn-inverse pull-left" href="dashboard/new/?editview=import" ng-show="ctrl.contextSrv.isEditor" ng-click="ctrl.isOpen = false;">
      <i class="fa fa-upload"></i>
      Import
    </a>

    <a class="pull-right search-button-row-explore-link" target="_blank" href="https://grafana.com/dashboards?utm_source=grafana_search">
      Find <img src="public/img/icn-dashboard-tiny.svg" width="14" /> dashboards on Grafana.com
    </a>

    <div class="clearfix"></div>
  </div>
</div>
Some files were not shown because too many files have changed in this diff.