Merge branch 'master' into backend_plugins

* master: (117 commits)
  fix: share snapshot controller was missing ngInject comment, fixes #10511
  Use URLEncoding instead of StdEncoding to be sure state value will be corectly decoded (#10512)
  Optimize metrics and notifications docs
  Optimize cli and provisioning docs
  docs: Guide for IIS reverse proxy
  changelog: adds note about closing #9645
  telegram: Send notifications with an inline image
  telegram: Switch to using multipart form rather than JSON as a body
  telegram: Fix a typo in variable name
  fix: alert list pause/start toggle was not working properly
  fix template variable selector overlap by the panel (#10493)
  dashboard: Close/hide 'Add Panel' before saving a dashboard (#10482)
  fix: removed unused param
  Fix variables values passing when both repeat rows and panels is used (#10488)
  moved angular-mocks out of dependencies
  ux: minor change to alert list page
  ux: minor word change to alert list
  fix: updated snapshot test
  Add eu-west-3 in cloudwatch datasource default's region (#10477)
  fix: Make sure orig files are not added to git again #10289
  ...
bergquist 2018-01-15 10:27:12 +01:00
commit 5499f4910c
458 changed files with 9766 additions and 10236 deletions

@@ -7,6 +7,8 @@ indent_size = 2
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
max_line_length = 120
insert_final_newline = true
[*.go]
indent_style = tab

.gitignore

@@ -60,3 +60,4 @@ debug.test
/vendor/**/*_test.go
/vendor/**/.editorconfig
/vendor/**/appengine*
*.orig

@@ -25,6 +25,8 @@ Dashboard panels and rows are positioned using a gridPos object `{x: 0, y: 0, w:
Config files for provisioning datasources as configuration have changed from `/conf/datasources` to `/conf/provisioning/datasources`.
From `/etc/grafana/datasources` to `/etc/grafana/provisioning/datasources` when installed with deb/rpm packages.
The pagerduty notifier now defaults to not auto resolve incidents. More details at [#10222](https://github.com/grafana/grafana/issues/10222)
## New Features
* **Data Source Proxy**: Add support for whitelisting specified cookies that will be passed through to the data source when proxying data source requests [#5457](https://github.com/grafana/grafana/issues/5457), thanks [@robingustafsson](https://github.com/robingustafsson)
* **Postgres/MySQL**: add __timeGroup macro for mysql [#9596](https://github.com/grafana/grafana/pull/9596), thanks [@svenklemm](https://github.com/svenklemm)
@@ -36,6 +38,7 @@ From `/etc/grafana/datasources` to `/etc/grafana/provisioning/datasources` when
* **Dashboard as cfg**: Load dashboards from file into Grafana on startup/change [#9654](https://github.com/grafana/grafana/issues/9654) [#5269](https://github.com/grafana/grafana/issues/5269)
* **Prometheus**: Grafana can now send alerts to Prometheus Alertmanager while firing [#7481](https://github.com/grafana/grafana/issues/7481), thx [@Thib17](https://github.com/Thib17) and [@mtanda](https://github.com/mtanda)
* **Table**: Support multiple table formated queries in table panel [#9170](https://github.com/grafana/grafana/issues/9170), thx [@davkal](https://github.com/davkal)
## Minor
* **Alert panel**: Adds placeholder text when no alerts are within the time range [#9624](https://github.com/grafana/grafana/issues/9624), thx [@straend](https://github.com/straend)
* **Mysql**: MySQL enable MaxOpenCon and MaxIdleCon regards how constring is configured. [#9784](https://github.com/grafana/grafana/issues/9784), thx [@dfredell](https://github.com/dfredell)
@@ -46,6 +49,8 @@ From `/etc/grafana/datasources` to `/etc/grafana/provisioning/datasources` when
* **Github**: Use organizations_url provided from github to verify user belongs in org. [#10111](https://github.com/grafana/grafana/issues/10111), thx
[@adiletmaratov](https://github.com/adiletmaratov)
* **Backend**: Fixed bug where Grafana exited before all sub routines where finished [#10131](https://github.com/grafana/grafana/issues/10131)
* **Azure**: Adds support for Azure blob storage as external image stor [#8955](https://github.com/grafana/grafana/issues/8955), thx [@saada](https://github.com/saada)
* **Telegram**: Add support for inline image uploads to telegram notifier plugin [#9967](https://github.com/grafana/grafana/pull/9967), thx [@rburchell](https://github.com/rburchell)
## Tech
* **RabbitMq**: Remove support for publishing events to RabbitMQ [#9645](https://github.com/grafana/grafana/issues/9645)
@@ -55,6 +60,7 @@ From `/etc/grafana/datasources` to `/etc/grafana/provisioning/datasources` when
* **Sensu**: Send alert message to sensu output [#9551](https://github.com/grafana/grafana/issues/9551), thx [@cjchand](https://github.com/cjchand)
* **Singlestat**: suppress error when result contains no datapoints [#9636](https://github.com/grafana/grafana/issues/9636), thx [@utkarshcmu](https://github.com/utkarshcmu)
* **Postgres/MySQL**: Control quoting in SQL-queries when using template variables [#9030](https://github.com/grafana/grafana/issues/9030), thanks [@svenklemm](https://github.com/svenklemm)
* **Pagerduty**: Pagerduty dont auto resolve incidents by default anymore. [#10222](https://github.com/grafana/grafana/issues/10222)
# 4.6.3 (2017-12-14)

@@ -473,7 +473,7 @@ sampler_param = 1
#################################### External Image Storage ##############
[external_image_storage]
-# You can choose between (s3, webdav, gcs)
+# You can choose between (s3, webdav, gcs, azure_blob)
provider =
[external_image_storage.s3]
@@ -493,4 +493,9 @@ public_url =
[external_image_storage.gcs]
key_file =
bucket =
path =
[external_image_storage.azure_blob]
account_name =
account_key =
container_name =

@@ -417,7 +417,7 @@ log_queries =
#################################### External image storage ##########################
[external_image_storage]
# Used for uploading images to public servers so they can be included in slack/email messages.
-# you can choose between (s3, webdav, gcs)
+# you can choose between (s3, webdav, gcs, azure_blob)
;provider =
[external_image_storage.s3]
@@ -436,4 +436,9 @@ log_queries =
[external_image_storage.gcs]
;key_file =
;bucket =
;path =
[external_image_storage.azure_blob]
;account_name =
;account_key =
;container_name =

@@ -0,0 +1,4 @@
FROM jmferrer/apache2-reverse-proxy:latest
COPY ports.conf /etc/apache2/sites-enabled
COPY proxy.conf /etc/apache2/sites-enabled

@@ -0,0 +1,9 @@
# This will proxy all requests for http://localhost:10081/grafana/ to
# http://localhost:3000 (Grafana running locally)
#
# Please note that you'll need to change the root_url in the Grafana configuration:
# root_url = %(protocol)s://%(domain)s:/grafana/
apacheproxy:
build: blocks/apache_proxy
network_mode: host

@@ -0,0 +1 @@
Listen 10081

@@ -0,0 +1,4 @@
<VirtualHost *:10081>
ProxyPass /grafana/ http://localhost:3000/
ProxyPassReverse /grafana/ http://localhost:3000/
</VirtualHost>

@@ -0,0 +1,3 @@
FROM nginx:alpine
COPY nginx.conf /etc/nginx/nginx.conf

@@ -0,0 +1,9 @@
# This will proxy all requests for http://localhost:10080/grafana/ to
# http://localhost:3000 (Grafana running locally)
#
# Please note that you'll need to change the root_url in the Grafana configuration:
# root_url = %(protocol)s://%(domain)s:/grafana/
nginxproxy:
build: blocks/nginx_proxy
network_mode: host

@@ -0,0 +1,19 @@
events { worker_connections 1024; }
http {
sendfile on;
proxy_redirect off;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Host $server_name;
server {
listen 10080;
location /grafana/ {
proxy_pass http://localhost:3000/;
}
}
}

@@ -10,17 +10,17 @@ weight = 8
# Grafana CLI
-Grafana cli is a small executable that is bundled with grafana server and is suppose to be executed on the same machine as grafana runs.
+Grafana cli is a small executable that is bundled with Grafana-server and is supposed to be executed on the same machine Grafana-server is running on.
## Plugins
-The CLI helps you install, upgrade and manage your plugins on the same machine it CLI is running.
+The CLI allows you to install, upgrade and manage your plugins on the machine it is running on.
-You can find more information about how to install and manage your plugins at the
-[plugin page]({{< relref "plugins/installation.md" >}}).
+You can find more information about how to install and manage your plugins in the
+[plugins page]({{< relref "plugins/installation.md" >}}).
## Admin
-> This feature is only available in grafana 4.1 and above.
+> This feature is only available in Grafana 4.1 and above.
To show all admin commands:
`grafana-cli admin`
@@ -39,7 +39,7 @@ then there are two flags that can be used to set homepath and the config file pa
`grafana-cli admin reset-admin-password --homepath "/usr/share/grafana" newpass`
-If you have not lost the admin password then it is better to set in the Grafana UI. If you need to set the password in a script then the [Grafana API](http://docs.grafana.org/http_api/user/#change-password) can be used. Here is an example with curl using basic auth:
+If you have not lost the admin password then it is better to set in the Grafana UI. If you need to set the password in a script then the [Grafana API](http://docs.grafana.org/http_api/user/#change-password) can be used. Here is an example using curl with basic auth:
```bash
curl -X PUT -H "Content-Type: application/json" -d '{

@@ -10,6 +10,6 @@ weight = 8
# Internal metrics
-Grafana collects some metrics about it self internally. Currently Grafana supports pushing metrics to graphite and exposing them to be scraped by Prometheus.
+Grafana collects some metrics about itself internally. Currently, Grafana supports pushing metrics to Graphite or exposing them to be scraped by Prometheus.
-To enabled internal metrics you have to enable it under the [metrics] section in your [grafana.ini](http://docs.grafana.org/installation/configuration/#enabled-6) config file.If you want to push metrics to graphite you have also have to configure the [metrics.graphite](http://docs.grafana.org/installation/configuration/#metrics-graphite) section.
+To emit internal metrics you have to enable the option under the [metrics] section in your [grafana.ini](http://docs.grafana.org/installation/configuration/#enabled-6) config file. If you want to push metrics to Graphite, you must also configure the [metrics.graphite](http://docs.grafana.org/installation/configuration/#metrics-graphite) section.
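For reference, a minimal version of the configuration this paragraph refers to looks roughly like the following in grafana.ini (a sketch only; the values are illustrative and the Graphite address and prefix are assumptions based on the shipped defaults):

```ini
[metrics]
# collect internal metrics and expose them for scraping (Prometheus)
enabled = true
interval_seconds = 10

[metrics.graphite]
# only needed when pushing to Graphite; leave empty to only expose metrics for scraping
address = localhost:2003
prefix = prod.grafana.%(instance_name)s.
```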

@@ -12,7 +12,7 @@ weight = 8
## Config file
-Checkout the [configuration](/installation/configuration) page for more information about what you can configure in `grafana.ini`
+Checkout the [configuration](/installation/configuration) page for more information on what you can configure in `grafana.ini`
### Config file locations
@@ -35,7 +35,7 @@ GF_<SectionName>_<KeyName>
```
Where the section name is the text within the brackets. Everything
-should be upper case, `.` should be replaced by `_`. For example, given these configuration settings:
+should be upper case and `.` should be replaced by `_`. For example, given these configuration settings:
```bash
# default section
@@ -48,7 +48,7 @@ admin_user = admin
client_secret = 0ldS3cretKey
```
-Then you can override them using:
+Overriding will be done like so:
```bash
export GF_DEFAULT_INSTANCE_NAME=my-instance
@@ -60,7 +60,7 @@ export GF_AUTH_GOOGLE_CLIENT_SECRET=newS3cretKey
## Configuration management tools
-Currently we do not provide any scripts/manifests for configuring Grafana. Rather then spending time learning and creating scripts/manifests for each tool, we think our time is better spent making Grafana easier to provision. Therefor, we heavily relay on the expertise of he community.
+Currently we do not provide any scripts/manifests for configuring Grafana. Rather than spending time learning and creating scripts/manifests for each tool, we think our time is better spent making Grafana easier to provision. Therefore, we heavily relay on the expertise of the community.
Tool | Project
-----|------------
@@ -70,14 +70,14 @@ Ansible | [https://github.com/picotrading/ansible-grafana](https://github.com/pi
Chef | [https://github.com/JonathanTron/chef-grafana](https://github.com/JonathanTron/chef-grafana)
Saltstack | [https://github.com/salt-formulas/salt-formula-grafana](https://github.com/salt-formulas/salt-formula-grafana)
## Datasources
> This feature is available from v5.0
It's possible to manage datasources in Grafana by adding one or more yaml config files in the [`provisioning/datasources`](/installation/configuration/#provisioning) directory. Each config file can contain a list of `datasources` that will be added or updated during start up. If the datasource already exists, Grafana will update it to match the configuration file. The config file can also contain a list of datasources that should be deleted. That list is called `delete_datasources`. Grafana will delete datasources listed in `delete_datasources` before inserting/updating those in the `datasource` list.
-### Running multiple grafana instances.
+### Running multiple Grafana instances.
-If you are running multiple instances of Grafana you might run into problems if they have different versions of the datasource.yaml configuration file. The best way to solve this problem is to add a version number to each datasource in the configuration and increase it when you update the config. Grafana will only update datasources with the same or lower version number than specified in the config. That way old configs cannot overwrite newer configs if they restart at the same time.
+If you are running multiple instances of Grafana you might run into problems if they have different versions of the `datasource.yaml` configuration file. The best way to solve this problem is to add a version number to each datasource in the configuration and increase it when you update the config. Grafana will only update datasources with the same or lower version number than specified in the config. That way, old configs cannot overwrite newer configs if they restart at the same time.
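A minimal sketch of the versioning described above (field names follow the example config file below; the values are illustrative): the datasource entry simply carries an explicit `version` that you bump on every change.

```yaml
datasources:
  - name: Graphite
    type: graphite
    url: http://localhost:8080
    access: proxy
    org_id: 1
    # bump on every edit; an instance only applies this entry if its stored
    # version is the same or lower than this value
    version: 2
```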
### Example datasource config file
```yaml
@@ -86,7 +86,7 @@ delete_datasources:
- name: Graphite
org_id: 1
# list of datasources to insert/update depending
# whats available in the datbase
datasources:
# <string, required> name of the datasource. Required
@@ -116,7 +116,7 @@ datasources:
# <bool> mark as default datasource. Max one per org
is_default:
# <map> fields that will be converted to json and stored in json_data
json_data:
graphiteVersion: "1.1"
tlsAuth: true
tlsAuthWithCACert: true
@@ -132,45 +132,46 @@ datasources:
#### Json data
-Since all datasources dont have the same configuration settings we only have the most common ones as fields. The rest should be stored as a json blob in the `json_data` field. Here are the most common settings that the core datasources use.
+Since not all datasources have the same configuration settings we only have the most common ones as fields. The rest should be stored as a json blob in the `json_data` field. Here are the most common settings that the core datasources use.
| Name | Type | Datasource |Description |
| ----| ---- | ---- | --- |
| tlsAuth | boolean | *All* | Enable TLS authentication using client cert configured in secure json data |
| tlsAuthWithCACert | boolean | *All* | Enable TLS authtication using CA cert |
| tlsSkipVerify | boolean | *All* | Controls whether a client verifies the server's certificate chain and host name. |
| graphiteVersion | string | Graphite | Graphite version |
| timeInterval | string | Elastic, Influxdb & Prometheus | Lowest interval/step value that should be used for this data source |
| esVersion | string | Elastic | Elasticsearch version |
| timeField | string | Elastic | Which field that should be used as timestamp |
| interval | string | Elastic | Index date time format |
| authType | string | Cloudwatch | Auth provider. keys/credentials/arn |
| assumeRoleArn | string | Cloudwatch | ARN of Assume Role |
| defaultRegion | string | Cloudwatch | AWS region |
| customMetricsNamespaces | string | Cloudwatch | Namespaces of Custom Metrics |
| tsdbVersion | string | OpenTsdb | Version |
| tsdbResolution | string | OpenTsdb | Resolution |
| sslmode | string | Postgre | SSLmode. 'disable', 'require', 'verify-ca' or 'verify-full' |
#### Secure Json data
{"authType":"keys","defaultRegion":"us-west-2","timeField":"@timestamp"}
-Secure json data is a map of settings that will be encrypted with [secret key](/installation/configuration/#secret-key) from the grafana config. The purpose of this is only to hide content from the users of the application. This should be used for storing TLS Cert and password that Grafana will append to request on the server side. All these settings are optional.
+Secure json data is a map of settings that will be encrypted with [secret key](/installation/configuration/#secret-key) from the Grafana config. The purpose of this is only to hide content from the users of the application. This should be used for storing TLS Cert and password that Grafana will append to the request on the server side. All of these settings are optional.
| Name | Type | Datasource | Description |
| ----| ---- | ---- | --- |
| tlsCACert | string | *All* |CA cert for out going requests |
| tlsClientCert | string | *All* |TLS Client cert for outgoing requests |
| tlsClientKey | string | *All* |TLS Client key for outgoing requests |
| password | string | Postgre | password |
| user | string | Postgre | user |
### Dashboards
-It's possible to manage dashboards in Grafana by adding one or more yaml config files in the [`provisioning/dashboards`](/installation/configuration/#provisioning) directory. Each config file can contain a list of `dashboards providers` that will load dashboards into grafana. Currently we only support reading dashboards from file but we will add more providers in the future.
+It's possible to manage dashboards in Grafana by adding one or more yaml config files in the [`provisioning/dashboards`](/installation/configuration/#provisioning) directory. Each config file can contain a list of `dashboards providers` that will load dashboards into Grafana. Currently we only support reading dashboards from file but we will add more providers in the future.
-The dashboard provider config file looks like this
+The dashboard provider config file looks somewhat like this:
```yaml
- name: 'default'
@@ -181,4 +182,4 @@ The dashboard provider config file looks like this
folder: /var/lib/grafana/dashboards
```
-When grafana starts it will update/insert all dashboards available in the configured folders. If you modify the file the dashboard will also be updated.
+When Grafana starts, it will update/insert all dashboards available in the configured folders. If you modify the file, the dashboard will also be updated.

@@ -13,7 +13,7 @@ weight = 2
> Alerting is only available in Grafana v4.0 and above.
-The alert engine publishes some internal metrics about itself. You can read more about how Grafana published [internal metrics](/installation/configuration/#metrics).
+The alert engine publishes some internal metrics about itself. You can read more about how Grafana publishes [internal metrics](/installation/configuration/#metrics).
Description | Type | Metric name
---------- | ----------- | ----------

@@ -14,9 +14,9 @@ weight = 2
> Alerting is only available in Grafana v4.0 and above.
-When an alert changes state it sends out notifications. Each alert rule can have
-multiple notifications. But in order to add a notification to an alert rule you first need
-to add and configure a `notification` channel (can be email, Pagerduty or other integration). This is done from the Notification Channels page.
+When an alert changes state, it sends out notifications. Each alert rule can have
+multiple notifications. In order to add a notification to an alert rule you first need
+to add and configure a `notification` channel (can be email, PagerDuty or other integration). This is done from the Notification Channels page.
## Notification Channel Setup
@@ -25,12 +25,12 @@ to add and configure a `notification` channel (can be email, Pagerduty or other
On the Notification Channels page hit the `New Channel` button to go the page where you
can configure and setup a new Notification Channel.
-You specify name and type, and type specific options. You can also test the notification to make
-sure it's working and setup correctly.
+You specify a name and a type, and type specific options. You can also test the notification to make
+sure it's setup correctly.
### Send on all alerts
-When checked this option will make this notification used for all alert rules, existing and new.
+When checked, this option will nofity for all alert rules - existing and new.
## Supported Notification Types
@@ -38,39 +38,39 @@ Grafana ships with the following set of notification types:
### Email
-To enable email notification you have to setup [SMTP settings](/installation/configuration/#smtp)
-in the Grafana config. Email notification will upload an image of the alert graph to an
-external image destination if available or fallback to attaching the image in the email.
+To enable email notifications you have to setup [SMTP settings](/installation/configuration/#smtp)
+in the Grafana config. Email notifications will upload an image of the alert graph to an
+external image destination if available or fallback to attaching the image to the email.
### Slack
{{< imgbox max-width="40%" img="/img/docs/v4/slack_notification.png" caption="Alerting Slack Notification" >}}
-To set up slack you need to configure an incoming webhook url at slack. You can follow their guide for how
-to do that https://api.slack.com/incoming-webhooks If you want to include screenshots of the firing alerts
-in the slack messages you have to configure either the [external image destination](#external-image-store) in Grafana,
+To set up slack you need to configure an incoming webhook url at slack. You can follow their guide on how
+to do that [here](https://api.slack.com/incoming-webhooks). If you want to include screenshots of the firing alerts
+in the Slack messages you have to configure either the [external image destination](#external-image-store) in Grafana,
or a bot integration via Slack Apps. Follow Slack's guide to set up a bot integration and use the token provided
-https://api.slack.com/bot-users, which starts with "xoxb".
+(https://api.slack.com/bot-users), which starts with "xoxb".
Setting | Description
---------- | -----------
-Recipient | allows you to override the slack recipient.
-Mention | make it possible to include a mention in the slack notification sent by Grafana. Ex @here or @channel
+Recipient | allows you to override the Slack recipient.
+Mention | make it possible to include a mention in the Slack notification sent by Grafana. Ex @here or @channel
Token | If provided, Grafana will upload the generated image via Slack's file.upload API method, not the external image destination.
### PagerDuty
-To set up PagerDuty, all you have to do is to provide an api key.
+To set up PagerDuty, all you have to do is to provide an API key.
Setting | Description
---------- | -----------
-Integration Key | Integration key for pagerduty.
-Auto resolve incidents | Resolve incidents in pagerduty once the alert goes back to ok
+Integration Key | Integration key for PagerDuty.
+Auto resolve incidents | Resolve incidents in PagerDuty once the alert goes back to ok
### Webhook
-The webhook notification is a simple way to send information about an state change over HTTP to a custom endpoint.
-Using this notification you could integrate Grafana into any system you choose, by yourself.
+The webhook notification is a simple way to send information about a state change over HTTP to a custom endpoint.
+Using this notification you could integrate Grafana into a system of your choosing.
Example json body:
@@ -117,19 +117,19 @@ Dingtalk supports the following "message type": `text`, `link` and `markdown`. O
### Kafka
-Notifications can be sent to a Kafka topic from Grafana using [Kafka REST Proxy](https://docs.confluent.io/1.0/kafka-rest/docs/index.html).
-There are couple of configurations options which need to be set in Grafana UI under Kafka Settings:
+Notifications can be sent to a Kafka topic from Grafana using the [Kafka REST Proxy](https://docs.confluent.io/1.0/kafka-rest/docs/index.html).
+There are a couple of configuration options which need to be set up in Grafana UI under Kafka Settings:
1. Kafka REST Proxy endpoint.
2. Kafka Topic.
-Once these two properties are set, you can send the alerts to Kafka for further processing or throttling them.
+Once these two properties are set, you can send the alerts to Kafka for further processing or throttling.
### All supported notifier
Name | Type |Support images
-----|------------ | ------
Slack | `slack` | yes
Pagerduty | `pagerduty` | yes
Email | `email` | yes
@@ -150,13 +150,13 @@ Prometheus Alertmanager | `prometheus-alertmanager` | no
# Enable images in notifications {#external-image-store}
Grafana can render the panel associated with the alert rule and include that in the notification. Most Notification Channels require that this image be publicly accessible (Slack and PagerDuty for example). In order to include images in alert notifications, Grafana can upload the image to an image store. It currently supports
-Amazon S3 and Webdav for this. So to set that up you need to configure the [external image uploader](/installation/configuration/#external-image-storage) in your grafana-server ini config file.
+Amazon S3, Webdav, and Azure Blob Storage for this. So to set that up you need to configure the [external image uploader](/installation/configuration/#external-image-storage) in your grafana-server ini config file.
Currently only the Email Channels attaches images if no external image store is specified. To include images in alert notifications for other channels then you need to set up an external image store.
-This is an optional requirement, you can get Slack and email notifications without setting this up.
+This is an optional requirement. You can get Slack and email notifications without setting this up.
# Configure the link back to Grafana from alert notifications
-All alert notifications contains a link back to the triggered alert in the Grafana instance.
+All alert notifications contain a link back to the triggered alert in the Grafana instance.
This url is based on the [domain](/installation/configuration/#domain) setting in Grafana.

@@ -55,7 +55,7 @@ of another alert in your conditions, and `Time Of Day`.
Alerting would not be very useful if there was no way to send notifications when rules trigger and change state. You
can setup notifications of different types. We currently have `Slack`, `PagerDuty`, `Email` and `Webhook` with more in the
pipe that will be added during beta period. The notifications can then be added to your alert rules.
-If you have configured an external image store in the grafana.ini config file (s3 and webdav options available)
+If you have configured an external image store in the grafana.ini config file (s3, webdav, and azure_blob options available)
you can get very rich notifications with an image of the graph and the metric
values all included in the notification.

@@ -68,5 +68,38 @@ server {
}
}
```
### IIS URL Rewrite Rule (Windows) with Subpath
IIS requires that the URL Rewrite module is installed.
Given:
- subpath `grafana`
- Grafana installed on `http://localhost:3000`
- server config:
```bash
[server]
domain = localhost:8080
root_url = %(protocol)s://%(domain)s:/grafana
```
Create an Inbound Rule for the parent website (localhost:8080 in this example) with the following settings:
- pattern: `grafana(/)?(.*)`
- check the `Ignore case` checkbox
- rewrite url set to `http://localhost:3000/{R:2}`
- check the `Append query string` checkbox
- check the `Stop processing of subsequent rules` checkbox
The rewrite rule that is generated for the web.config:
```xml
<rewrite>
<rules>
<rule name="Grafana" enabled="true" stopProcessing="true">
<match url="grafana(/)?(.*)" />
<action type="Rewrite" url="http://localhost:3000/{R:2}" logRewrittenUrl="false" />
</rule>
</rules>
</rewrite>
```

@@ -496,7 +496,7 @@ name = BitBucket
enabled = true
allow_sign_up = true
client_id = <client id>
-client_secret = <secret>
+client_secret = <client secret>
scopes = account email
auth_url = https://bitbucket.org/site/oauth2/authorize
token_url = https://bitbucket.org/site/oauth2/access_token
@@ -505,6 +505,41 @@ team_ids =
allowed_organizations =
```
### Set up oauth2 with OneLogin
1. Create a new Custom Connector with the following settings:
- Name: Grafana
- Sign On Method: OpenID Connect
- Redirect URI: `https://<grafana domain>/login/generic_oauth`
- Signing Algorithm: RS256
- Login URL: `https://<grafana domain>/login/generic_oauth`
then:
2. Add an App to the Grafana Connector:
- Display Name: Grafana
then:
3. Under the SSO tab on the Grafana App details page you'll find the Client ID and Client Secret.
Your OneLogin Domain will match the url you use to access OneLogin.
Configure Grafana as follows:
```bash
[auth.generic_oauth]
name = OneLogin
enabled = true
allow_sign_up = true
client_id = <client id>
client_secret = <client secret>
scopes = openid email name
auth_url = https://<onelogin domain>.onelogin.com/oidc/auth
token_url = https://<onelogin domain>.onelogin.com/oidc/token
api_url = https://<onelogin domain>.onelogin.com/oidc/me
team_ids =
allowed_organizations =
```
<hr>
## [auth.basic]
@@ -731,7 +766,7 @@ Time to live for snapshots.
These options control how images should be made public so they can be shared on services like slack.
### provider
-You can choose between (s3, webdav, gcs). If left empty Grafana will ignore the upload action.
+You can choose between (s3, webdav, gcs, azure_blob). If left empty Grafana will ignore the upload action.
## [external_image_storage.s3]
@@ -786,6 +821,17 @@ Bucket Name on Google Cloud Storage.
### path
Optional extra path inside bucket
## [external_image_storage.azure_blob]
### account_name
Storage account name
### account_key
Storage account key
### container_name
Container name where to store "Blob" images with random names. Creating the blob container beforehand is required. Only public containers are supported.
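Putting the three settings documented above together with the provider switch, the relevant grafana.ini fragment would look roughly like this (a sketch; the account, key and container values are placeholders):

```ini
[external_image_storage]
provider = azure_blob

[external_image_storage.azure_blob]
# placeholders - use your storage account's name, one of its access keys,
# and a pre-created public container
account_name = mygrafanastorage
account_key = <base64 encoded account key>
container_name = grafana-images
```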
## [alerting]
### enabled

@@ -71,8 +71,8 @@ Each field in the dashboard JSON is explained below with its usage:
| **timepicker** | timepicker metadata, see [timepicker section](#timepicker) for details |
| **templating** | templating metadata, see [templating section](#templating) for details |
| **annotations** | annotations metadata, see [annotations section](#annotations) for details |
-| **schemaVersion** | TODO |
-| **version** | TODO |
+| **schemaVersion** | version of the JSON schema (integer), incremented each time a Grafana update brings changes to the said schema |
+| **version** | version of the dashboard (integer), incremented each time the dashboard is updated |
| **links** | TODO |
### rows ### rows

@@ -0,0 +1,89 @@
+++
title = "Grafana with IIS Reverse Proxy on Windows"
type = "docs"
keywords = ["grafana", "tutorials", "proxy", "IIS", "windows"]
[menu.docs]
parent = "tutorials"
weight = 10
+++
# How to Use IIS with URL Rewrite as a Reverse Proxy for Grafana on Windows
If you want Grafana to be a subdomain under a website in IIS then the URL Rewrite module for ISS can be used to support this.
Example:
- Parent site: http://localhost:8080
- Grafana: http://localhost:3000
Grafana as a subdomain: http://localhost:8080/grafana
## Setup
If you have not already done it, then a requirement is to install URL Rewrite module for IIS.
Download and install the URL Rewrite module for IIS: https://www.iis.net/downloads/microsoft/url-rewrite
## Grafana Config
The Grafana config can be set by creating a file named `custom.ini` in the `conf` subdirectory of your Grafana installation. See the [installation instructions](http://docs.grafana.org/installation/windows/#configure) for more details.
Given that the subpath should be `grafana` and the parent site is `localhost:8080` then add this to the `custom.ini` config file:
```bash
[server]
domain = localhost:8080
root_url = %(protocol)s://%(domain)s:/grafana
```
Restart the Grafana server after changing the config file.
## IIS Config
1. Open the IIS Manager and click on the parent website
2. In the admin console for this website, double click on the Url Rewrite option:
{{< docs-imagebox img="/img/docs/tutorials/IIS_admin_console.png" max-width= "800px" >}}
3. Click on the `Add Rule(s)...` action
4. Choose the Blank Rule template for an Inbound Rule
{{< docs-imagebox img="/img/docs/tutorials/IIS_add_inbound_rule.png" max-width= "800px" >}}
5. Create an Inbound Rule for the parent website (localhost:8080 in this example) with the following settings:
- pattern: `grafana(/)?(.*)`
- check the `Ignore case` checkbox
- rewrite url set to `http://localhost:3000/{R:2}`
- check the `Append query string` checkbox
- check the `Stop processing of subsequent rules` checkbox
{{< docs-imagebox img="/img/docs/tutorials/IIS_url_rewrite.png" max-width= "800px" >}}
Finally, navigate to `http://localhost:8080/grafana` (replace `http://localhost:8080` with your parent domain) and you should come to the Grafana login page.
## Troubleshooting
### 404 error
When navigating to the grafana url (`http://localhost:8080/grafana` in the example above) and a `HTTP Error 404.0 - Not Found` error is returned then either:
- the pattern for the Inbound Rule is incorrect. Edit the rule, click on the `Test pattern...` button, test the part of the url after `http://localhost:8080/` and make sure it matches. For `grafana/login` the test should return 3 capture groups: {R:0}: `grafana` {R:1}: `/` and {R:2}: `login`.
- The `root_url` setting in the Grafana config file does not match the parent url with subpath.
### Grafana Website only shows text with no images or css
{{< docs-imagebox img="/img/docs/tutorials/IIS_proxy_error.png" max-width= "800px" >}}
1. The `root_url` setting in the Grafana config file does not match the parent url with subpath. This could happen if the root_url is commented out by mistake (`;` is used for commenting out a line in .ini files):
`; root_url = %(protocol)s://%(domain)s:/grafana`
2. or if the subpath in the `root_url` setting does not match the subpath used in the pattern in the Inbound Rule in IIS:
`root_url = %(protocol)s://%(domain)s:/grafana`
pattern in Inbound Rule: `wrongsubpath(/)?(.*)`
3. or if the Rewrite Url in the Inbound Rule is incorrect.
The Rewrite Url should not include the subpath.
The Rewrite Url should contain the capture group from the pattern matching that returns the part of the url after the subpath. The pattern used above returns 3 capture groups and the third one {R:2} returns the part of the url after `http://localhost:8080/grafana/`.

@@ -24,5 +24,6 @@ module.exports = {
"setupFiles": [
"./public/test/jest-shim.ts",
"./public/test/jest-setup.ts"
-]
+],
"snapshotSerializers": ["enzyme-to-json/serializer"],
};

@@ -25,6 +25,7 @@
"css-loader": "^0.28.7",
"enzyme": "^3.1.0",
"enzyme-adapter-react-16": "^1.0.1",
"enzyme-to-json": "^3.3.0",
"es6-promise": "^3.0.2",
"es6-shim": "^0.35.3",
"expect.js": "~0.2.0",
@@ -54,7 +55,7 @@
"html-loader": "^0.5.1",
"html-webpack-plugin": "^2.30.1",
"husky": "^0.14.3",
-"jest": "^21.2.1",
+"jest": "^22.0.4",
"jshint-stylish": "~2.2.1",
"json-loader": "^0.5.7",
"karma": "1.7.0",
@@ -83,14 +84,15 @@
"sinon": "1.17.6",
"systemjs": "0.20.19",
"systemjs-plugin-css": "^0.1.36",
-"ts-jest": "^21.1.3",
-"ts-loader": "^2.3.7",
-"tslint": "^5.7.0",
+"ts-jest": "^22.0.0",
+"ts-loader": "^3.2.0",
+"tslint": "^5.8.0",
"tslint-loader": "^3.5.3",
-"typescript": "^2.5.2",
-"webpack": "^3.6.0",
+"typescript": "^2.6.2",
+"webpack": "^3.10.0",
"webpack-bundle-analyzer": "^2.9.0",
"webpack-cleanup-plugin": "^0.5.1",
"angular-mocks": "^1.6.6",
"webpack-merge": "^4.1.0",
"zone.js": "^0.7.2"
},
@@ -107,19 +109,23 @@
},
"lint-staged": {
"*.{ts,tsx}": [
-"prettier --single-quote --trailing-comma es5 --write",
+"prettier --write",
"git add"
],
"*.scss": [
-"prettier --single-quote --write",
+"prettier --write",
"git add"
]
},
"prettier": {
"trailingComma": "es5",
"singleQuote": true,
"printWidth": 120
},
"license": "Apache-2.0",
"dependencies": {
"angular": "^1.6.6",
"angular-bindonce": "^0.3.1",
-"angular-mocks": "^1.6.6",
"angular-native-dragdrop": "^1.2.2",
"angular-route": "^1.6.6",
"angular-sanitize": "^1.6.6",
@@ -133,13 +139,19 @@
"file-saver": "^1.3.3",
"jquery": "^3.2.1",
"lodash": "^4.17.4",
"mobx": "^3.4.1",
"mobx-react": "^4.3.5",
"mobx-state-tree": "^1.3.1",
"moment": "^2.18.1",
"mousetrap": "^1.6.0",
"perfect-scrollbar": "^1.2.0",
"prop-types": "^15.6.0",
-"react": "^16.1.1",
-"react-dom": "^16.1.1",
+"react": "^16.2.0",
+"react-dom": "^16.2.0",
"react-grid-layout": "^0.16.1",
"react-popper": "^0.7.5",
"react-highlight-words": "^0.10.0",
"react-select": "^1.1.0",
"react-sizeme": "^2.3.6",
"remarkable": "^1.7.1",
"rxjs": "^5.4.3",

@@ -278,7 +278,7 @@ func PauseAlert(c *middleware.Context, dto dtos.PauseAlertCommand) Response {
}
var response models.AlertStateType = models.AlertStatePending
-pausedState := "un paused"
+pausedState := "un-paused"
if cmd.Paused {
response = models.AlertStatePaused
pausedState = "paused"
@@ -287,7 +287,7 @@ func PauseAlert(c *middleware.Context, dto dtos.PauseAlertCommand) Response {
result := map[string]interface{}{
"alertId": alertId,
"state": response,
-"message": "alert " + pausedState,
+"message": "Alert " + pausedState,
}
return Json(200, result)

@@ -40,8 +40,11 @@ func (hs *HttpServer) registerRoutes() {
r.Get("/datasources/", reqSignedIn, Index)
r.Get("/datasources/new", reqSignedIn, Index)
r.Get("/datasources/edit/*", reqSignedIn, Index)
r.Get("/org/users", reqSignedIn, Index)
r.Get("/org/users/new", reqSignedIn, Index)
r.Get("/org/users/invite", reqSignedIn, Index)
r.Get("/org/teams", reqSignedIn, Index)
r.Get("/org/teams/*", reqSignedIn, Index)
r.Get("/org/apikeys/", reqSignedIn, Index)
r.Get("/dashboard/import/", reqSignedIn, Index)
r.Get("/configuration", reqGrafanaAdmin, Index)

@@ -157,11 +157,11 @@ func NewCacheServer() *CacheServer {
func newNotFound() *Avatar {
avatar := &Avatar{notFound: true}
-// load transparent png into buffer
-path := filepath.Join(setting.StaticRootPath, "img", "transparent.png")
+// load user_profile png into buffer
+path := filepath.Join(setting.StaticRootPath, "img", "user_profile.png")
if data, err := ioutil.ReadFile(path); err != nil {
-log.Error(3, "Failed to read transparent.png, %v", path)
+log.Error(3, "Failed to read user_profile.png, %v", path)
} else {
avatar.data = bytes.NewBuffer(data)
}

@@ -3,6 +3,7 @@ package dtos
import (
"crypto/md5"
"fmt"
"regexp"
"strings"
"github.com/grafana/grafana/pkg/components/simplejson"
@@ -57,3 +58,19 @@ func GetGravatarUrl(text string) string {
hasher.Write([]byte(strings.ToLower(text)))
return fmt.Sprintf(setting.AppSubUrl+"/avatar/%x", hasher.Sum(nil))
}
func GetGravatarUrlWithDefault(text string, defaultText string) string {
if text != "" {
return GetGravatarUrl(text)
}
reg, err := regexp.Compile("[^a-zA-Z0-9]+")
if err != nil {
return ""
}
text = reg.ReplaceAllString(defaultText, "") + "@localhost"
return GetGravatarUrl(text)
}

@@ -35,7 +35,7 @@ var (
func GenStateString() string {
rnd := make([]byte, 32)
rand.Read(rnd)
-return base64.StdEncoding.EncodeToString(rnd)
+return base64.URLEncoding.EncodeToString(rnd)
}
func OAuthLogin(ctx *middleware.Context) {
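As a side note on why this one-character change matters (a standalone sketch, not part of the commit): the standard base64 alphabet emits '+' and '/', which tend to get percent-encoded or mangled when the OAuth state value round-trips through URLs and cookies, while the URL-safe alphabet uses '-' and '_' instead.

```go
package main

import (
	"encoding/base64"
	"fmt"
)

func main() {
	// bytes chosen so the difference between the two alphabets is visible
	raw := []byte{0xfb, 0xff, 0xfe, 0xfb, 0xff, 0xfe}

	fmt.Println(base64.StdEncoding.EncodeToString(raw)) // "+//++//+" - needs escaping in a URL or cookie
	fmt.Println(base64.URLEncoding.EncodeToString(raw)) // "-__--__-" - survives the round trip unchanged
}
```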

@@ -31,11 +31,12 @@ func RenderToPng(c *middleware.Context) {
pngPath, err := renderer.RenderToPng(renderOpts)
-if err != nil {
-if err == renderer.ErrTimeout {
-c.Handle(500, err.Error(), err)
+if err != nil && err == renderer.ErrTimeout {
+c.Handle(500, err.Error(), err)
+return
}
if err != nil {
c.Handle(500, "Rendering failed.", err)
return
}

@@ -1,6 +1,7 @@
package api
import (
"github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/middleware"
m "github.com/grafana/grafana/pkg/models"
@@ -70,6 +71,10 @@ func SearchTeams(c *middleware.Context) Response {
return ApiError(500, "Failed to search Teams", err)
}
for _, team := range query.Result.Teams {
team.AvatarUrl = dtos.GetGravatarUrlWithDefault(team.Email, team.Name)
}
query.Result.Page = page
query.Result.PerPage = perPage

@@ -1,6 +1,7 @@
package api
import (
"github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/middleware"
m "github.com/grafana/grafana/pkg/models"
@@ -15,6 +16,10 @@ func GetTeamMembers(c *middleware.Context) Response {
return ApiError(500, "Failed to get Team Members", err)
}
for _, member := range query.Result {
member.AvatarUrl = dtos.GetGravatarUrl(member.Email)
}
return Json(200, query.Result)
}

@@ -0,0 +1,320 @@
package imguploader
import (
"bytes"
"context"
"crypto/hmac"
"crypto/sha256"
"encoding/base64"
"encoding/xml"
"fmt"
"io"
"io/ioutil"
"mime"
"net/http"
"net/url"
"os"
"path"
"sort"
"strconv"
"strings"
"time"
"github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/util"
)
type AzureBlobUploader struct {
account_name string
account_key string
container_name string
log log.Logger
}
func NewAzureBlobUploader(account_name string, account_key string, container_name string) *AzureBlobUploader {
return &AzureBlobUploader{
account_name: account_name,
account_key: account_key,
container_name: container_name,
log: log.New("azureBlobUploader"),
}
}
// Receive path of image on disk and return azure blob url
func (az *AzureBlobUploader) Upload(ctx context.Context, imageDiskPath string) (string, error) {
// setup client
blob := NewStorageClient(az.account_name, az.account_key)
file, err := os.Open(imageDiskPath)
if err != nil {
return "", err
}
randomFileName := util.GetRandomString(30) + ".png"
// upload image
az.log.Debug("Uploading image to azure_blob", "conatiner_name", az.container_name, "blob_name", randomFileName)
resp, err := blob.FileUpload(az.container_name, randomFileName, file)
if err != nil {
return "", err
}
if resp.StatusCode > 400 && resp.StatusCode < 600 {
body, _ := ioutil.ReadAll(io.LimitReader(resp.Body, 1<<20))
aerr := &Error{
Code: resp.StatusCode,
Status: resp.Status,
Body: body,
Header: resp.Header,
}
aerr.parseXML()
resp.Body.Close()
return "", aerr
}
if err != nil {
return "", err
}
url := fmt.Sprintf("https://%s.blob.core.windows.net/%s/%s", az.account_name, az.container_name, randomFileName)
return url, nil
}
// --- AZURE LIBRARY
type Blobs struct {
XMLName xml.Name `xml:"EnumerationResults"`
Items []Blob `xml:"Blobs>Blob"`
}
type Blob struct {
Name string `xml:"Name"`
Property Property `xml:"Properties"`
}
type Property struct {
LastModified string `xml:"Last-Modified"`
Etag string `xml:"Etag"`
ContentLength int `xml:"Content-Length"`
ContentType string `xml:"Content-Type"`
BlobType string `xml:"BlobType"`
LeaseStatus string `xml:"LeaseStatus"`
}
type Error struct {
Code int
Status string
Body []byte
Header http.Header
AzureCode string
}
func (e *Error) Error() string {
return fmt.Sprintf("status %d: %s", e.Code, e.Body)
}
func (e *Error) parseXML() {
var xe xmlError
_ = xml.NewDecoder(bytes.NewReader(e.Body)).Decode(&xe)
e.AzureCode = xe.Code
}
type xmlError struct {
XMLName xml.Name `xml:"Error"`
Code string
Message string
}
const ms_date_layout = "Mon, 02 Jan 2006 15:04:05 GMT"
const version = "2017-04-17"
var client = &http.Client{}
type StorageClient struct {
Auth *Auth
Transport http.RoundTripper
}
func (c *StorageClient) transport() http.RoundTripper {
if c.Transport != nil {
return c.Transport
}
return http.DefaultTransport
}
func NewStorageClient(account, accessKey string) *StorageClient {
return &StorageClient{
Auth: &Auth{
account,
accessKey,
},
Transport: nil,
}
}
func (c *StorageClient) absUrl(format string, a ...interface{}) string {
part := fmt.Sprintf(format, a...)
return fmt.Sprintf("https://%s.blob.core.windows.net/%s", c.Auth.Account, part)
}
func copyHeadersToRequest(req *http.Request, headers map[string]string) {
for k, v := range headers {
req.Header[k] = []string{v}
}
}
func (c *StorageClient) FileUpload(container, blobName string, body io.Reader) (*http.Response, error) {
blobName = escape(blobName)
extension := strings.ToLower(path.Ext(blobName))
contentType := mime.TypeByExtension(extension)
buf := new(bytes.Buffer)
buf.ReadFrom(body)
req, err := http.NewRequest(
"PUT",
c.absUrl("%s/%s", container, blobName),
buf,
)
if err != nil {
return nil, err
}
copyHeadersToRequest(req, map[string]string{
"x-ms-blob-type": "BlockBlob",
"x-ms-date": time.Now().UTC().Format(ms_date_layout),
"x-ms-version": version,
"Accept-Charset": "UTF-8",
"Content-Type": contentType,
"Content-Length": strconv.Itoa(buf.Len()),
})
c.Auth.SignRequest(req)
return c.transport().RoundTrip(req)
}
func escape(content string) string {
content = url.QueryEscape(content)
// Azure expects %20 for spaces rather than the + (plus) produced by QueryEscape
content = strings.Replace(content, "+", "%20", -1)
// Azure expects a literal forward slash rather than %2F
content = strings.Replace(content, "%2F", "/", -1)
return content
}
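
The two Replace calls above exist because Azure's canonical blob-name form differs from Go's url.QueryEscape output in exactly those two characters. A small sketch of the effect, using an invented blob name:

```go
package imguploader

import "fmt"

// escapeSketch demonstrates the escaping rules above on an invented blob name
// containing a space and a slash.
func escapeSketch() {
	fmt.Println(escape("alert shots/panel 3.png"))
	// prints: alert%20shots/panel%203.png
	// url.QueryEscape alone would have produced: alert+shots%2Fpanel+3.png
}
```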
type Auth struct {
Account string
Key string
}
func (a *Auth) SignRequest(req *http.Request) {
strToSign := fmt.Sprintf("%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s\n%s",
strings.ToUpper(req.Method),
tryget(req.Header, "Content-Encoding"),
tryget(req.Header, "Content-Language"),
tryget(req.Header, "Content-Length"),
tryget(req.Header, "Content-MD5"),
tryget(req.Header, "Content-Type"),
tryget(req.Header, "Date"),
tryget(req.Header, "If-Modified-Since"),
tryget(req.Header, "If-Match"),
tryget(req.Header, "If-None-Match"),
tryget(req.Header, "If-Unmodified-Since"),
tryget(req.Header, "Range"),
a.canonicalizedHeaders(req),
a.canonicalizedResource(req),
)
decodedKey, _ := base64.StdEncoding.DecodeString(a.Key)
sha256 := hmac.New(sha256.New, []byte(decodedKey))
sha256.Write([]byte(strToSign))
signature := base64.StdEncoding.EncodeToString(sha256.Sum(nil))
copyHeadersToRequest(req, map[string]string{
"Authorization": fmt.Sprintf("SharedKey %s:%s", a.Account, signature),
})
}
func tryget(headers map[string][]string, key string) string {
// Return the first value of the header if present; otherwise default to an
// empty string, matching the server side behavior when generating signatures.
if len(headers[key]) > 0 {
return headers[key][0]
}
return ""
}
//
// The following is copied ~95% verbatim from:
// http://github.com/loldesign/azure/ -> core/core.go
//
/*
Based on Azure docs:
Link: http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx#Constructing_Element
1) Retrieve all headers for the resource that begin with x-ms-, including the x-ms-date header.
2) Convert each HTTP header name to lowercase.
3) Sort the headers lexicographically by header name, in ascending order. Note that each header may appear only once in the string.
4) Unfold the string by replacing any breaking white space with a single space.
5) Trim any white space around the colon in the header.
6) Finally, append a new line character to each canonicalized header in the resulting list. Construct the CanonicalizedHeaders string by concatenating all headers in this list into a single string.
*/
func (a *Auth) canonicalizedHeaders(req *http.Request) string {
var buffer bytes.Buffer
for key, value := range req.Header {
lowerKey := strings.ToLower(key)
if strings.HasPrefix(lowerKey, "x-ms-") {
if buffer.Len() == 0 {
buffer.WriteString(fmt.Sprintf("%s:%s", lowerKey, value[0]))
} else {
buffer.WriteString(fmt.Sprintf("\n%s:%s", lowerKey, value[0]))
}
}
}
splitted := strings.Split(buffer.String(), "\n")
sort.Strings(splitted)
return strings.Join(splitted, "\n")
}
/*
Based on Azure docs
Link: http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx#Constructing_Element
1) Beginning with an empty string (""), append a forward slash (/), followed by the name of the account that owns the resource being accessed.
2) Append the resource's encoded URI path, without any query parameters.
3) Retrieve all query parameters on the resource URI, including the comp parameter if it exists.
4) Convert all parameter names to lowercase.
5) Sort the query parameters lexicographically by parameter name, in ascending order.
6) URL-decode each query parameter name and value.
7) Append each query parameter name and value to the string in the following format, making sure to include the colon (:) between the name and the value:
parameter-name:parameter-value
8) If a query parameter has more than one value, sort all values lexicographically, then include them in a comma-separated list:
parameter-name:parameter-value-1,parameter-value-2,parameter-value-n
9) Append a new line character (\n) after each name-value pair.
Rules:
1) Avoid using the new line character (\n) in values for query parameters. If it must be used, ensure that it does not affect the format of the canonicalized resource string.
2) Avoid using commas in query parameter values.
*/
func (a *Auth) canonicalizedResource(req *http.Request) string {
var buffer bytes.Buffer
buffer.WriteString(fmt.Sprintf("/%s%s", a.Account, req.URL.Path))
queries := req.URL.Query()
for key, values := range queries {
sort.Strings(values)
buffer.WriteString(fmt.Sprintf("\n%s:%s", key, strings.Join(values, ",")))
}
splitted := strings.Split(buffer.String(), "\n")
sort.Strings(splitted)
return strings.Join(splitted, "\n")
}
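
The two comment blocks above describe how Azure wants request headers and the resource path canonicalized before signing. A hedged, illustrative sketch of what those two helpers produce for a simple PUT; it assumes it lives in package imguploader next to the code above, and the URL and header values are placeholders:

```go
package imguploader

import (
	"fmt"
	"net/http"
)

// canonicalizationSketch is illustrative only: it prints the canonicalized
// header and resource strings that SignRequest appends to the end of the
// string-to-sign for a PUT of one blob with a single query parameter.
func canonicalizationSketch() {
	req, _ := http.NewRequest("PUT", "https://example.blob.core.windows.net/grafana/alert.png?comp=metadata", nil)
	req.Header.Set("X-Ms-Blob-Type", "BlockBlob")
	req.Header.Set("X-Ms-Date", "Mon, 15 Jan 2018 09:00:00 GMT")
	req.Header.Set("X-Ms-Version", "2017-04-17")

	a := &Auth{Account: "example", Key: ""}

	fmt.Println(a.canonicalizedHeaders(req))
	// x-ms-blob-type:BlockBlob
	// x-ms-date:Mon, 15 Jan 2018 09:00:00 GMT
	// x-ms-version:2017-04-17

	fmt.Println(a.canonicalizedResource(req))
	// /example/grafana/alert.png
	// comp:metadata
}
```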

View File

@ -0,0 +1,24 @@
package imguploader
import (
"context"
"testing"
"github.com/grafana/grafana/pkg/setting"
. "github.com/smartystreets/goconvey/convey"
)
func TestUploadToAzureBlob(t *testing.T) {
SkipConvey("[Integration test] for external_image_store.azure_blob", t, func() {
err := setting.NewConfigContext(&setting.CommandLineArgs{
HomePath: "../../../",
})
uploader, _ := NewImageUploader()
path, err := uploader.Upload(context.Background(), "../../../public/img/logo_transparent_400x.png")
So(err, ShouldBeNil)
So(path, ShouldNotEqual, "")
})
}

View File

@ -3,6 +3,7 @@ package imguploader
import ( import (
"context" "context"
"fmt" "fmt"
"github.com/grafana/grafana/pkg/log"
"regexp" "regexp"
"github.com/grafana/grafana/pkg/setting" "github.com/grafana/grafana/pkg/setting"
@ -76,6 +77,21 @@ func NewImageUploader() (ImageUploader, error) {
path := gcssec.Key("path").MustString("") path := gcssec.Key("path").MustString("")
return NewGCSUploader(keyFile, bucketName, path), nil return NewGCSUploader(keyFile, bucketName, path), nil
case "azure_blob":
azureBlobSec, err := setting.Cfg.GetSection("external_image_storage.azure_blob")
if err != nil {
return nil, err
}
account_name := azureBlobSec.Key("account_name").MustString("")
account_key := azureBlobSec.Key("account_key").MustString("")
container_name := azureBlobSec.Key("container_name").MustString("")
return NewAzureBlobUploader(account_name, account_key, container_name), nil
}
if setting.ImageUploadProvider != "" {
log.Error2("The external image storage configuration is invalid", "unsupported provider", setting.ImageUploadProvider)
} }
return NopImageUploader{}, nil return NopImageUploader{}, nil

View File

@ -119,5 +119,29 @@ func TestImageUploaderFactory(t *testing.T) {
So(original.keyFile, ShouldEqual, "/etc/secrets/project-79a52befa3f6.json") So(original.keyFile, ShouldEqual, "/etc/secrets/project-79a52befa3f6.json")
So(original.bucket, ShouldEqual, "project-grafana-east") So(original.bucket, ShouldEqual, "project-grafana-east")
}) })
Convey("AzureBlobUploader config", func() {
setting.NewConfigContext(&setting.CommandLineArgs{
HomePath: "../../../",
})
setting.ImageUploadProvider = "azure_blob"
Convey("with container name", func() {
azureBlobSec, err := setting.Cfg.GetSection("external_image_storage.azure_blob")
azureBlobSec.NewKey("account_name", "account_name")
azureBlobSec.NewKey("account_key", "account_key")
azureBlobSec.NewKey("container_name", "container_name")
uploader, err := NewImageUploader()
So(err, ShouldBeNil)
original, ok := uploader.(*AzureBlobUploader)
So(ok, ShouldBeTrue)
So(original.account_name, ShouldEqual, "account_name")
So(original.account_key, ShouldEqual, "account_key")
So(original.container_name, ShouldEqual, "container_name")
})
})
}) })
} }

View File

@ -139,8 +139,12 @@ func (dash *Dashboard) GetString(prop string, defaultValue string) string {
// UpdateSlug updates the slug // UpdateSlug updates the slug
func (dash *Dashboard) UpdateSlug() { func (dash *Dashboard) UpdateSlug() {
title := strings.ToLower(dash.Data.Get("title").MustString()) title := dash.Data.Get("title").MustString()
dash.Slug = slug.Make(title) dash.Slug = SlugifyTitle(title)
}
func SlugifyTitle(title string) string {
return slug.Make(strings.ToLower(title))
} }
// //

View File

@ -16,6 +16,12 @@ func TestDashboardModel(t *testing.T) {
So(dashboard.Slug, ShouldEqual, "grafana-play-home") So(dashboard.Slug, ShouldEqual, "grafana-play-home")
}) })
Convey("Can slugify title", t, func() {
slug := SlugifyTitle("Grafana Play Home")
So(slug, ShouldEqual, "grafana-play-home")
})
Convey("Given a dashboard json", t, func() { Convey("Given a dashboard json", t, func() {
json := simplejson.New() json := simplejson.New()
json.Set("title", "test dash") json.Set("title", "test dash")

View File

@ -16,6 +16,7 @@ type Team struct {
Id int64 `json:"id"` Id int64 `json:"id"`
OrgId int64 `json:"orgId"` OrgId int64 `json:"orgId"`
Name string `json:"name"` Name string `json:"name"`
Email string `json:"email"`
Created time.Time `json:"created"` Created time.Time `json:"created"`
Updated time.Time `json:"updated"` Updated time.Time `json:"updated"`
@ -26,14 +27,16 @@ type Team struct {
type CreateTeamCommand struct { type CreateTeamCommand struct {
Name string `json:"name" binding:"Required"` Name string `json:"name" binding:"Required"`
Email string `json:"email"`
OrgId int64 `json:"-"` OrgId int64 `json:"-"`
Result Team `json:"-"` Result Team `json:"-"`
} }
type UpdateTeamCommand struct { type UpdateTeamCommand struct {
Id int64 Id int64
Name string Name string
Email string
} }
type DeleteTeamCommand struct { type DeleteTeamCommand struct {
@ -64,6 +67,8 @@ type SearchTeamDto struct {
Id int64 `json:"id"` Id int64 `json:"id"`
OrgId int64 `json:"orgId"` OrgId int64 `json:"orgId"`
Name string `json:"name"` Name string `json:"name"`
Email string `json:"email"`
AvatarUrl string `json:"avatarUrl"`
MemberCount int64 `json:"memberCount"` MemberCount int64 `json:"memberCount"`
} }

View File

@ -47,9 +47,10 @@ type GetTeamMembersQuery struct {
// Projections and DTOs // Projections and DTOs
type TeamMemberDTO struct { type TeamMemberDTO struct {
OrgId int64 `json:"orgId"` OrgId int64 `json:"orgId"`
TeamId int64 `json:"teamId"` TeamId int64 `json:"teamId"`
UserId int64 `json:"userId"` UserId int64 `json:"userId"`
Email string `json:"email"` Email string `json:"email"`
Login string `json:"login"` Login string `json:"login"`
AvatarUrl string `json:"avatarUrl"`
} }

View File

@ -40,7 +40,7 @@ func getPluginLogoUrl(pluginType, path, baseUrl string) string {
} }
func (fp *FrontendPluginBase) setPathsBasedOnApp(app *AppPlugin) { func (fp *FrontendPluginBase) setPathsBasedOnApp(app *AppPlugin) {
appSubPath := strings.Replace(strings.Replace(fp.PluginDir, app.PluginDir, "", 1), "\\", "/", 1) appSubPath := strings.Replace(strings.Replace(fp.PluginDir, app.PluginDir, "", 1), "\\", "/", -1)
fp.IncludedInAppId = app.Id fp.IncludedInAppId = app.Id
fp.BaseUrl = app.BaseUrl fp.BaseUrl = app.BaseUrl

View File

@ -14,7 +14,7 @@ func TestFrontendPlugin(t *testing.T) {
fp := &FrontendPluginBase{ fp := &FrontendPluginBase{
PluginBase: PluginBase{ PluginBase: PluginBase{
PluginDir: "c:\\grafana\\public\\app\\plugins\\app\\testdata\\datasource", PluginDir: "c:\\grafana\\public\\app\\plugins\\app\\testdata\\datasources\\datasource",
BaseUrl: "fpbase", BaseUrl: "fpbase",
}, },
} }
@ -29,6 +29,6 @@ func TestFrontendPlugin(t *testing.T) {
} }
fp.setPathsBasedOnApp(app) fp.setPathsBasedOnApp(app)
So(fp.Module, ShouldEqual, "app/plugins/app/testdata/datasource/module") So(fp.Module, ShouldEqual, "app/plugins/app/testdata/datasources/datasource/module")
}) })
} }

View File

@ -42,7 +42,7 @@ var (
) )
func NewPagerdutyNotifier(model *m.AlertNotification) (alerting.Notifier, error) { func NewPagerdutyNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
autoResolve := model.Settings.Get("autoResolve").MustBool(true) autoResolve := model.Settings.Get("autoResolve").MustBool(false)
key := model.Settings.Get("integrationKey").MustString() key := model.Settings.Get("integrationKey").MustString()
if key == "" { if key == "" {
return nil, alerting.ValidationError{Reason: "Could not find integration key property in settings"} return nil, alerting.ValidationError{Reason: "Could not find integration key property in settings"}

View File

@ -10,7 +10,6 @@ import (
func TestPagerdutyNotifier(t *testing.T) { func TestPagerdutyNotifier(t *testing.T) {
Convey("Pagerduty notifier tests", t, func() { Convey("Pagerduty notifier tests", t, func() {
Convey("Parsing alert notification from settings", func() { Convey("Parsing alert notification from settings", func() {
Convey("empty settings should return error", func() { Convey("empty settings should return error", func() {
json := `{ }` json := `{ }`
@ -26,10 +25,31 @@ func TestPagerdutyNotifier(t *testing.T) {
So(err, ShouldNotBeNil) So(err, ShouldNotBeNil)
}) })
Convey("auto resolve should default to false", func() {
json := `{ "integrationKey": "abcdefgh0123456789" }`
settingsJSON, _ := simplejson.NewJson([]byte(json))
model := &m.AlertNotification{
Name: "pagerduty_testing",
Type: "pagerduty",
Settings: settingsJSON,
}
not, err := NewPagerdutyNotifier(model)
pagerdutyNotifier := not.(*PagerdutyNotifier)
So(err, ShouldBeNil)
So(pagerdutyNotifier.Name, ShouldEqual, "pagerduty_testing")
So(pagerdutyNotifier.Type, ShouldEqual, "pagerduty")
So(pagerdutyNotifier.Key, ShouldEqual, "abcdefgh0123456789")
So(pagerdutyNotifier.AutoResolve, ShouldBeFalse)
})
Convey("settings should trigger incident", func() { Convey("settings should trigger incident", func() {
json := ` json := `
{ {
"integrationKey": "abcdefgh0123456789" "integrationKey": "abcdefgh0123456789",
"autoResolve": false
}` }`
settingsJSON, _ := simplejson.NewJson([]byte(json)) settingsJSON, _ := simplejson.NewJson([]byte(json))
@ -46,8 +66,8 @@ func TestPagerdutyNotifier(t *testing.T) {
So(pagerdutyNotifier.Name, ShouldEqual, "pagerduty_testing") So(pagerdutyNotifier.Name, ShouldEqual, "pagerduty_testing")
So(pagerdutyNotifier.Type, ShouldEqual, "pagerduty") So(pagerdutyNotifier.Type, ShouldEqual, "pagerduty")
So(pagerdutyNotifier.Key, ShouldEqual, "abcdefgh0123456789") So(pagerdutyNotifier.Key, ShouldEqual, "abcdefgh0123456789")
So(pagerdutyNotifier.AutoResolve, ShouldBeFalse)
}) })
}) })
}) })
} }

View File

@ -1,17 +1,19 @@
package notifiers package notifiers
import ( import (
"bytes"
"fmt" "fmt"
"github.com/grafana/grafana/pkg/bus" "github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/log" "github.com/grafana/grafana/pkg/log"
m "github.com/grafana/grafana/pkg/models" m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/alerting" "github.com/grafana/grafana/pkg/services/alerting"
"io"
"mime/multipart"
"os"
) )
var ( var (
telegeramApiUrl string = "https://api.telegram.org/bot%s/%s" telegramApiUrl string = "https://api.telegram.org/bot%s/%s"
) )
func init() { func init() {
@ -47,9 +49,10 @@ func init() {
type TelegramNotifier struct { type TelegramNotifier struct {
NotifierBase NotifierBase
BotToken string BotToken string
ChatID string ChatID string
log log.Logger UploadImage bool
log log.Logger
} }
func NewTelegramNotifier(model *m.AlertNotification) (alerting.Notifier, error) { func NewTelegramNotifier(model *m.AlertNotification) (alerting.Notifier, error) {
@ -59,6 +62,7 @@ func NewTelegramNotifier(model *m.AlertNotification) (alerting.Notifier, error)
botToken := model.Settings.Get("bottoken").MustString() botToken := model.Settings.Get("bottoken").MustString()
chatId := model.Settings.Get("chatid").MustString() chatId := model.Settings.Get("chatid").MustString()
uploadImage := model.Settings.Get("uploadImage").MustBool()
if botToken == "" { if botToken == "" {
return nil, alerting.ValidationError{Reason: "Could not find Bot Token in settings"} return nil, alerting.ValidationError{Reason: "Could not find Bot Token in settings"}
@ -72,31 +76,42 @@ func NewTelegramNotifier(model *m.AlertNotification) (alerting.Notifier, error)
NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings), NotifierBase: NewNotifierBase(model.Id, model.IsDefault, model.Name, model.Type, model.Settings),
BotToken: botToken, BotToken: botToken,
ChatID: chatId, ChatID: chatId,
UploadImage: uploadImage,
log: log.New("alerting.notifier.telegram"), log: log.New("alerting.notifier.telegram"),
}, nil }, nil
} }
func (this *TelegramNotifier) ShouldNotify(context *alerting.EvalContext) bool { func (this *TelegramNotifier) buildMessage(evalContext *alerting.EvalContext, sendImageInline bool) *m.SendWebhookSync {
return defaultShouldNotify(context) var imageFile *os.File
} var err error
func (this *TelegramNotifier) Notify(evalContext *alerting.EvalContext) error { if sendImageInline {
this.log.Info("Sending alert notification to", "bot_token", this.BotToken) imageFile, err = os.Open(evalContext.ImageOnDiskPath)
this.log.Info("Sending alert notification to", "chat_id", this.ChatID) defer imageFile.Close()
if err != nil {
sendImageInline = false // fall back to text message
}
}
bodyJSON := simplejson.New() message := ""
bodyJSON.Set("chat_id", this.ChatID) if sendImageInline {
bodyJSON.Set("parse_mode", "html") // Telegram's API does not allow HTML formatting for image captions.
message = fmt.Sprintf("%s\nState: %s\nMessage: %s\n", evalContext.GetNotificationTitle(), evalContext.Rule.Name, evalContext.Rule.Message)
message := fmt.Sprintf("<b>%s</b>\nState: %s\nMessage: %s\n", evalContext.GetNotificationTitle(), evalContext.Rule.Name, evalContext.Rule.Message) } else {
message = fmt.Sprintf("<b>%s</b>\nState: %s\nMessage: %s\n", evalContext.GetNotificationTitle(), evalContext.Rule.Name, evalContext.Rule.Message)
}
ruleUrl, err := evalContext.GetRuleUrl() ruleUrl, err := evalContext.GetRuleUrl()
if err == nil { if err == nil {
message = message + fmt.Sprintf("URL: %s\n", ruleUrl) message = message + fmt.Sprintf("URL: %s\n", ruleUrl)
} }
if evalContext.ImagePublicUrl != "" {
message = message + fmt.Sprintf("Image: %s\n", evalContext.ImagePublicUrl) if !sendImageInline {
// only attach this if we are not sending it inline.
if evalContext.ImagePublicUrl != "" {
message = message + fmt.Sprintf("Image: %s\n", evalContext.ImagePublicUrl)
}
} }
metrics := "" metrics := ""
@ -107,19 +122,69 @@ func (this *TelegramNotifier) Notify(evalContext *alerting.EvalContext) error {
break break
} }
} }
if metrics != "" { if metrics != "" {
message = message + fmt.Sprintf("\n<i>Metrics:</i>%s", metrics) if sendImageInline {
// Telegram's API does not allow HTML formatting for image captions.
message = message + fmt.Sprintf("\nMetrics:%s", metrics)
} else {
message = message + fmt.Sprintf("\n<i>Metrics:</i>%s", metrics)
}
} }
bodyJSON.Set("text", message) var body bytes.Buffer
url := fmt.Sprintf(telegeramApiUrl, this.BotToken, "sendMessage") w := multipart.NewWriter(&body)
body, _ := bodyJSON.MarshalJSON() fw, _ := w.CreateFormField("chat_id")
fw.Write([]byte(this.ChatID))
if sendImageInline {
fw, _ = w.CreateFormField("caption")
fw.Write([]byte(message))
fw, _ = w.CreateFormFile("photo", evalContext.ImageOnDiskPath)
io.Copy(fw, imageFile)
} else {
fw, _ = w.CreateFormField("text")
fw.Write([]byte(message))
fw, _ = w.CreateFormField("parse_mode")
fw.Write([]byte("html"))
}
w.Close()
apiMethod := ""
if sendImageInline {
this.log.Info("Sending telegram image notification", "photo", evalContext.ImageOnDiskPath, "chat_id", this.ChatID, "bot_token", this.BotToken)
apiMethod = "sendPhoto"
} else {
this.log.Info("Sending telegram text notification", "chat_id", this.ChatID, "bot_token", this.BotToken)
apiMethod = "sendMessage"
}
url := fmt.Sprintf(telegramApiUrl, this.BotToken, apiMethod)
cmd := &m.SendWebhookSync{ cmd := &m.SendWebhookSync{
Url: url, Url: url,
Body: string(body), Body: body.String(),
HttpMethod: "POST", HttpMethod: "POST",
HttpHeader: map[string]string{
"Content-Type": w.FormDataContentType(),
},
}
return cmd
}
func (this *TelegramNotifier) ShouldNotify(context *alerting.EvalContext) bool {
return defaultShouldNotify(context)
}
func (this *TelegramNotifier) Notify(evalContext *alerting.EvalContext) error {
var cmd *m.SendWebhookSync
if evalContext.ImagePublicUrl == "" && this.UploadImage == true {
cmd = this.buildMessage(evalContext, true)
} else {
cmd = this.buildMessage(evalContext, false)
} }
if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil { if err := bus.DispatchCtx(evalContext.Ctx, cmd); err != nil {
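
In short, the notifier now picks the Telegram API method from two inputs: whether inline image uploading is enabled for the notifier and whether an external image store already produced a public URL. A hedged restatement of that rule (the helper name is ours, not part of the patch):

```go
package notifiers

// chooseTelegramMethod restates the dispatch rule used by Notify above:
// upload the rendered image inline only when the notifier has uploadImage
// enabled and no public image URL is available.
func chooseTelegramMethod(uploadImage bool, imagePublicUrl string) string {
	if imagePublicUrl == "" && uploadImage {
		return "sendPhoto" // multipart form: chat_id, caption, photo (file contents)
	}
	return "sendMessage" // multipart form: chat_id, text, parse_mode=html
}
```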

View File

@ -0,0 +1,33 @@
package dashboards
import (
"github.com/grafana/grafana/pkg/services/dashboards"
gocache "github.com/patrickmn/go-cache"
"time"
)
type dashboardCache struct {
internalCache *gocache.Cache
}
func NewDashboardCache() *dashboardCache {
return &dashboardCache{internalCache: gocache.New(5*time.Minute, 30*time.Minute)}
}
func (fr *dashboardCache) addDashboardCache(key string, json *dashboards.SaveDashboardItem) {
fr.internalCache.Add(key, json, time.Minute*10)
}
func (fr *dashboardCache) getCache(key string) (*dashboards.SaveDashboardItem, bool) {
obj, exist := fr.internalCache.Get(key)
if !exist {
return nil, exist
}
dash, ok := obj.(*dashboards.SaveDashboardItem)
if !ok {
return nil, ok
}
return dash, ok
}
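
The cache above is a thin wrapper around go-cache that the file reader keys by dashboard file path. An illustrative sketch of how the walker further down uses it (the path is a placeholder):

```go
package dashboards

import "github.com/grafana/grafana/pkg/services/dashboards"

// cacheUsageSketch is illustrative only: the file reader keys the cache by
// file path and later compares the cached UpdatedAt against the file's
// ModTime to decide whether the dashboard needs to be re-read.
func cacheUsageSketch(item *dashboards.SaveDashboardItem) {
	cache := NewDashboardCache()
	cache.addDashboardCache("/var/lib/grafana/dashboards/dash1.json", item)

	if cached, exists := cache.getCache("/var/lib/grafana/dashboards/dash1.json"); exists {
		_ = cached.UpdatedAt // compared with fileInfo.ModTime() in createWalkFn
	}
}
```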

View File

@ -2,6 +2,7 @@ package dashboards
import ( import (
"context" "context"
"errors"
"fmt" "fmt"
"os" "os"
"path/filepath" "path/filepath"
@ -15,7 +16,12 @@ import (
"github.com/grafana/grafana/pkg/components/simplejson" "github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/log" "github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/models" "github.com/grafana/grafana/pkg/models"
gocache "github.com/patrickmn/go-cache" )
var (
checkDiskForChangesInterval time.Duration = time.Second * 3
ErrFolderNameMissing error = errors.New("Folder name missing")
) )
type fileReader struct { type fileReader struct {
@ -23,7 +29,8 @@ type fileReader struct {
Path string Path string
log log.Logger log log.Logger
dashboardRepo dashboards.Repository dashboardRepo dashboards.Repository
cache *gocache.Cache cache *dashboardCache
createWalk func(fr *fileReader, folderId int64) filepath.WalkFunc
} }
func NewDashboardFileReader(cfg *DashboardsAsConfig, log log.Logger) (*fileReader, error) { func NewDashboardFileReader(cfg *DashboardsAsConfig, log log.Logger) (*fileReader, error) {
@ -41,32 +48,15 @@ func NewDashboardFileReader(cfg *DashboardsAsConfig, log log.Logger) (*fileReade
Path: path, Path: path,
log: log, log: log,
dashboardRepo: dashboards.GetRepository(), dashboardRepo: dashboards.GetRepository(),
cache: gocache.New(5*time.Minute, 30*time.Minute), cache: NewDashboardCache(),
createWalk: createWalkFn,
}, nil }, nil
} }
func (fr *fileReader) addCache(key string, json *dashboards.SaveDashboardItem) {
fr.cache.Add(key, json, time.Minute*10)
}
func (fr *fileReader) getCache(key string) (*dashboards.SaveDashboardItem, bool) {
obj, exist := fr.cache.Get(key)
if !exist {
return nil, exist
}
dash, ok := obj.(*dashboards.SaveDashboardItem)
if !ok {
return nil, ok
}
return dash, ok
}
func (fr *fileReader) ReadAndListen(ctx context.Context) error { func (fr *fileReader) ReadAndListen(ctx context.Context) error {
ticker := time.NewTicker(time.Second * 3) ticker := time.NewTicker(checkDiskForChangesInterval)
if err := fr.walkFolder(); err != nil { if err := fr.startWalkingDisk(); err != nil {
fr.log.Error("failed to search for dashboards", "error", err) fr.log.Error("failed to search for dashboards", "error", err)
} }
@ -78,7 +68,9 @@ func (fr *fileReader) ReadAndListen(ctx context.Context) error {
if !running { // avoid walking the filesystem in parallel, in case the fs is very slow. if !running { // avoid walking the filesystem in parallel, in case the fs is very slow.
running = true running = true
go func() { go func() {
fr.walkFolder() if err := fr.startWalkingDisk(); err != nil {
fr.log.Error("failed to search for dashboards", "error", err)
}
running = false running = false
}() }()
} }
@ -88,14 +80,57 @@ func (fr *fileReader) ReadAndListen(ctx context.Context) error {
} }
} }
func (fr *fileReader) walkFolder() error { func (fr *fileReader) startWalkingDisk() error {
if _, err := os.Stat(fr.Path); err != nil { if _, err := os.Stat(fr.Path); err != nil {
if os.IsNotExist(err) { if os.IsNotExist(err) {
return err return err
} }
} }
return filepath.Walk(fr.Path, func(path string, fileInfo os.FileInfo, err error) error { folderId, err := getOrCreateFolderId(fr.Cfg, fr.dashboardRepo)
if err != nil && err != ErrFolderNameMissing {
return err
}
return filepath.Walk(fr.Path, fr.createWalk(fr, folderId))
}
func getOrCreateFolderId(cfg *DashboardsAsConfig, repo dashboards.Repository) (int64, error) {
if cfg.Folder == "" {
return 0, ErrFolderNameMissing
}
cmd := &models.GetDashboardQuery{Slug: models.SlugifyTitle(cfg.Folder), OrgId: cfg.OrgId}
err := bus.Dispatch(cmd)
if err != nil && err != models.ErrDashboardNotFound {
return 0, err
}
// dashboard folder not found. create one.
if err == models.ErrDashboardNotFound {
dash := &dashboards.SaveDashboardItem{}
dash.Dashboard = models.NewDashboard(cfg.Folder)
dash.Dashboard.IsFolder = true
dash.Overwrite = true
dash.OrgId = cfg.OrgId
dbDash, err := repo.SaveDashboard(dash)
if err != nil {
return 0, err
}
return dbDash.Id, nil
}
if !cmd.Result.IsFolder {
return 0, fmt.Errorf("Got invalid response. Expected folder, found dashboard")
}
return cmd.Result.Id, nil
}
func createWalkFn(fr *fileReader, folderId int64) filepath.WalkFunc {
return func(path string, fileInfo os.FileInfo, err error) error {
if err != nil { if err != nil {
return err return err
} }
@ -110,12 +145,12 @@ func (fr *fileReader) walkFolder() error {
return nil return nil
} }
cachedDashboard, exist := fr.getCache(path) cachedDashboard, exist := fr.cache.getCache(path)
if exist && cachedDashboard.UpdatedAt == fileInfo.ModTime() { if exist && cachedDashboard.UpdatedAt == fileInfo.ModTime() {
return nil return nil
} }
dash, err := fr.readDashboardFromFile(path) dash, err := fr.readDashboardFromFile(path, folderId)
if err != nil { if err != nil {
fr.log.Error("failed to load dashboard from ", "file", path, "error", err) fr.log.Error("failed to load dashboard from ", "file", path, "error", err)
return nil return nil
@ -147,10 +182,10 @@ func (fr *fileReader) walkFolder() error {
fr.log.Debug("loading dashboard from disk into database.", "file", path) fr.log.Debug("loading dashboard from disk into database.", "file", path)
_, err = fr.dashboardRepo.SaveDashboard(dash) _, err = fr.dashboardRepo.SaveDashboard(dash)
return err return err
}) }
} }
func (fr *fileReader) readDashboardFromFile(path string) (*dashboards.SaveDashboardItem, error) { func (fr *fileReader) readDashboardFromFile(path string, folderId int64) (*dashboards.SaveDashboardItem, error) {
reader, err := os.Open(path) reader, err := os.Open(path)
if err != nil { if err != nil {
return nil, err return nil, err
@ -167,12 +202,12 @@ func (fr *fileReader) readDashboardFromFile(path string) (*dashboards.SaveDashbo
return nil, err return nil, err
} }
dash, err := createDashboardJson(data, stat.ModTime(), fr.Cfg) dash, err := createDashboardJson(data, stat.ModTime(), fr.Cfg, folderId)
if err != nil { if err != nil {
return nil, err return nil, err
} }
fr.addCache(path, dash) fr.cache.addDashboardCache(path, dash)
return dash, nil return dash, nil
} }

View File

@ -2,6 +2,7 @@ package dashboards
import ( import (
"os" "os"
"path/filepath"
"testing" "testing"
"time" "time"
@ -22,7 +23,7 @@ var (
) )
func TestDashboardFileReader(t *testing.T) { func TestDashboardFileReader(t *testing.T) {
Convey("Reading dashboards from disk", t, func() { Convey("Dashboard file reader", t, func() {
bus.ClearBusHandlers() bus.ClearBusHandlers()
fakeRepo = &fakeDashboardRepo{} fakeRepo = &fakeDashboardRepo{}
@ -30,91 +31,191 @@ func TestDashboardFileReader(t *testing.T) {
dashboards.SetRepository(fakeRepo) dashboards.SetRepository(fakeRepo)
logger := log.New("test.logger") logger := log.New("test.logger")
cfg := &DashboardsAsConfig{ Convey("Reading dashboards from disk", func() {
Name: "Default",
Type: "file",
OrgId: 1,
Folder: "",
Options: map[string]interface{}{},
}
Convey("Can read default dashboard", func() {
cfg.Options["folder"] = defaultDashboards
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.walkFolder()
So(err, ShouldBeNil)
So(len(fakeRepo.inserted), ShouldEqual, 2)
})
Convey("Should not update dashboards when db is newer", func() {
cfg.Options["folder"] = oneDashboard
fakeRepo.getDashboard = append(fakeRepo.getDashboard, &models.Dashboard{
Updated: time.Now().Add(time.Hour),
Slug: "grafana",
})
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.walkFolder()
So(err, ShouldBeNil)
So(len(fakeRepo.inserted), ShouldEqual, 0)
})
Convey("Can read default dashboard and replace old version in database", func() {
cfg.Options["folder"] = oneDashboard
stat, _ := os.Stat(oneDashboard + "/dashboard1.json")
fakeRepo.getDashboard = append(fakeRepo.getDashboard, &models.Dashboard{
Updated: stat.ModTime().AddDate(0, 0, -1),
Slug: "grafana",
})
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.walkFolder()
So(err, ShouldBeNil)
So(len(fakeRepo.inserted), ShouldEqual, 1)
})
Convey("Invalid configuration should return error", func() {
cfg := &DashboardsAsConfig{ cfg := &DashboardsAsConfig{
Name: "Default", Name: "Default",
Type: "file", Type: "file",
OrgId: 1, OrgId: 1,
Folder: "", Folder: "",
Options: map[string]interface{}{},
} }
_, err := NewDashboardFileReader(cfg, logger) Convey("Can read default dashboard", func() {
So(err, ShouldNotBeNil) cfg.Options["folder"] = defaultDashboards
cfg.Folder = "Team A"
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.startWalkingDisk()
So(err, ShouldBeNil)
folders := 0
dashboards := 0
for _, i := range fakeRepo.inserted {
if i.Dashboard.IsFolder {
folders++
} else {
dashboards++
}
}
So(dashboards, ShouldEqual, 2)
So(folders, ShouldEqual, 1)
})
Convey("Should not update dashboards when db is newer", func() {
cfg.Options["folder"] = oneDashboard
fakeRepo.getDashboard = append(fakeRepo.getDashboard, &models.Dashboard{
Updated: time.Now().Add(time.Hour),
Slug: "grafana",
})
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.startWalkingDisk()
So(err, ShouldBeNil)
So(len(fakeRepo.inserted), ShouldEqual, 0)
})
Convey("Can read default dashboard and replace old version in database", func() {
cfg.Options["folder"] = oneDashboard
stat, _ := os.Stat(oneDashboard + "/dashboard1.json")
fakeRepo.getDashboard = append(fakeRepo.getDashboard, &models.Dashboard{
Updated: stat.ModTime().AddDate(0, 0, -1),
Slug: "grafana",
})
reader, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
err = reader.startWalkingDisk()
So(err, ShouldBeNil)
So(len(fakeRepo.inserted), ShouldEqual, 1)
})
Convey("Invalid configuration should return error", func() {
cfg := &DashboardsAsConfig{
Name: "Default",
Type: "file",
OrgId: 1,
Folder: "",
}
_, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldNotBeNil)
})
Convey("Broken dashboards should not cause error", func() {
cfg.Options["folder"] = brokenDashboards
_, err := NewDashboardFileReader(cfg, logger)
So(err, ShouldBeNil)
})
}) })
Convey("Broken dashboards should not cause error", func() { Convey("Should not create new folder if folder name is missing", func() {
cfg := &DashboardsAsConfig{ cfg := &DashboardsAsConfig{
Name: "Default", Name: "Default",
Type: "file", Type: "file",
OrgId: 1, OrgId: 1,
Folder: "", Folder: "",
Options: map[string]interface{}{ Options: map[string]interface{}{
"folder": brokenDashboards, "folder": defaultDashboards,
}, },
} }
_, err := NewDashboardFileReader(cfg, logger) _, err := getOrCreateFolderId(cfg, fakeRepo)
So(err, ShouldEqual, ErrFolderNameMissing)
})
Convey("can get or Create dashboard folder", func() {
cfg := &DashboardsAsConfig{
Name: "Default",
Type: "file",
OrgId: 1,
Folder: "TEAM A",
Options: map[string]interface{}{
"folder": defaultDashboards,
},
}
folderId, err := getOrCreateFolderId(cfg, fakeRepo)
So(err, ShouldBeNil) So(err, ShouldBeNil)
inserted := false
for _, d := range fakeRepo.inserted {
if d.Dashboard.IsFolder && d.Dashboard.Id == folderId {
inserted = true
}
}
So(len(fakeRepo.inserted), ShouldEqual, 1)
So(inserted, ShouldBeTrue)
})
Convey("Walking the folder with dashboards", func() {
cfg := &DashboardsAsConfig{
Name: "Default",
Type: "file",
OrgId: 1,
Folder: "",
Options: map[string]interface{}{
"folder": defaultDashboards,
},
}
reader, err := NewDashboardFileReader(cfg, log.New("test-logger"))
So(err, ShouldBeNil)
Convey("should skip dirs that starts with .", func() {
shouldSkip := reader.createWalk(reader, 0)("path", &FakeFileInfo{isDirectory: true, name: ".folder"}, nil)
So(shouldSkip, ShouldEqual, filepath.SkipDir)
})
Convey("should keep walking if file is not .json", func() {
shouldSkip := reader.createWalk(reader, 0)("path", &FakeFileInfo{isDirectory: true, name: "folder"}, nil)
So(shouldSkip, ShouldBeNil)
})
}) })
}) })
} }
type FakeFileInfo struct {
isDirectory bool
name string
}
func (ffi *FakeFileInfo) IsDir() bool {
return ffi.isDirectory
}
func (ffi FakeFileInfo) Size() int64 {
return 1
}
func (ffi FakeFileInfo) Mode() os.FileMode {
return 0777
}
func (ffi FakeFileInfo) Name() string {
return ffi.name
}
func (ffi FakeFileInfo) ModTime() time.Time {
return time.Time{}
}
func (ffi FakeFileInfo) Sys() interface{} {
return nil
}
type fakeDashboardRepo struct { type fakeDashboardRepo struct {
inserted []*dashboards.SaveDashboardItem inserted []*dashboards.SaveDashboardItem
getDashboard []*models.Dashboard getDashboard []*models.Dashboard

View File

@ -18,14 +18,16 @@ type DashboardsAsConfig struct {
Options map[string]interface{} `json:"options" yaml:"options"` Options map[string]interface{} `json:"options" yaml:"options"`
} }
func createDashboardJson(data *simplejson.Json, lastModified time.Time, cfg *DashboardsAsConfig) (*dashboards.SaveDashboardItem, error) { func createDashboardJson(data *simplejson.Json, lastModified time.Time, cfg *DashboardsAsConfig, folderId int64) (*dashboards.SaveDashboardItem, error) {
dash := &dashboards.SaveDashboardItem{} dash := &dashboards.SaveDashboardItem{}
dash.Dashboard = models.NewDashboardFromJson(data) dash.Dashboard = models.NewDashboardFromJson(data)
dash.UpdatedAt = lastModified dash.UpdatedAt = lastModified
dash.Overwrite = true dash.Overwrite = true
dash.OrgId = cfg.OrgId dash.OrgId = cfg.OrgId
dash.Dashboard.Data.Set("editable", cfg.Editable) dash.Dashboard.FolderId = folderId
if !cfg.Editable {
dash.Dashboard.Data.Set("editable", cfg.Editable)
}
if dash.Dashboard.Title == "" { if dash.Dashboard.Title == "" {
return nil, models.ErrDashboardTitleEmpty return nil, models.ErrDashboardTitleEmpty

View File

@ -88,7 +88,7 @@ func HandleAlertsQuery(query *m.GetAlertsQuery) error {
params = append(params, query.PanelId) params = append(params, query.PanelId)
} }
if len(query.State) > 0 && query.State[0] != "ALL" { if len(query.State) > 0 && query.State[0] != "all" {
sql.WriteString(` AND (`) sql.WriteString(` AND (`)
for i, v := range query.State { for i, v := range query.State {
if i > 0 { if i > 0 {

View File

@ -45,4 +45,9 @@ func addTeamMigrations(mg *Migrator) {
//------- indexes ------------------ //------- indexes ------------------
mg.AddMigration("add index team_member.org_id", NewAddIndexMigration(teamMemberV1, teamMemberV1.Indices[0])) mg.AddMigration("add index team_member.org_id", NewAddIndexMigration(teamMemberV1, teamMemberV1.Indices[0]))
mg.AddMigration("add unique index team_member_org_id_team_id_user_id", NewAddIndexMigration(teamMemberV1, teamMemberV1.Indices[1])) mg.AddMigration("add unique index team_member_org_id_team_id_user_id", NewAddIndexMigration(teamMemberV1, teamMemberV1.Indices[1]))
// add column email
mg.AddMigration("Add column email to team table", NewAddColumnMigration(teamV1, &Column{
Name: "email", Type: DB_NVarchar, Nullable: true, Length: 190,
}))
} }

View File

@ -13,7 +13,7 @@ func init() {
bus.AddHandler("sql", GetAdminStats) bus.AddHandler("sql", GetAdminStats)
} }
var activeUserTimeLimit time.Duration = time.Hour * 24 * 14 var activeUserTimeLimit time.Duration = time.Hour * 24 * 30
func GetDataSourceStats(query *m.GetDataSourceStatsQuery) error { func GetDataSourceStats(query *m.GetDataSourceStatsQuery) error {
var rawSql = `SELECT COUNT(*) as count, type FROM data_source GROUP BY type` var rawSql = `SELECT COUNT(*) as count, type FROM data_source GROUP BY type`

View File

@ -33,6 +33,7 @@ func CreateTeam(cmd *m.CreateTeamCommand) error {
team := m.Team{ team := m.Team{
Name: cmd.Name, Name: cmd.Name,
Email: cmd.Email,
OrgId: cmd.OrgId, OrgId: cmd.OrgId,
Created: time.Now(), Created: time.Now(),
Updated: time.Now(), Updated: time.Now(),
@ -57,9 +58,12 @@ func UpdateTeam(cmd *m.UpdateTeamCommand) error {
team := m.Team{ team := m.Team{
Name: cmd.Name, Name: cmd.Name,
Email: cmd.Email,
Updated: time.Now(), Updated: time.Now(),
} }
sess.MustCols("email")
affectedRows, err := sess.Id(cmd.Id).Update(&team) affectedRows, err := sess.Id(cmd.Id).Update(&team)
if err != nil { if err != nil {
@ -125,6 +129,7 @@ func SearchTeams(query *m.SearchTeamsQuery) error {
sql.WriteString(`select sql.WriteString(`select
team.id as id, team.id as id,
team.name as name, team.name as name,
team.email as email,
(select count(*) from team_member where team_member.team_id = team.id) as member_count (select count(*) from team_member where team_member.team_id = team.id) as member_count
from team as team from team as team
where team.org_id = ?`) where team.org_id = ?`)

View File

@ -27,8 +27,8 @@ func TestTeamCommandsAndQueries(t *testing.T) {
userIds = append(userIds, userCmd.Result.Id) userIds = append(userIds, userCmd.Result.Id)
} }
group1 := m.CreateTeamCommand{Name: "group1 name"} group1 := m.CreateTeamCommand{Name: "group1 name", Email: "test1@test.com"}
group2 := m.CreateTeamCommand{Name: "group2 name"} group2 := m.CreateTeamCommand{Name: "group2 name", Email: "test2@test.com"}
err := CreateTeam(&group1) err := CreateTeam(&group1)
So(err, ShouldBeNil) So(err, ShouldBeNil)
@ -43,6 +43,7 @@ func TestTeamCommandsAndQueries(t *testing.T) {
team1 := query.Result.Teams[0] team1 := query.Result.Teams[0]
So(team1.Name, ShouldEqual, "group1 name") So(team1.Name, ShouldEqual, "group1 name")
So(team1.Email, ShouldEqual, "test1@test.com")
err = AddTeamMember(&m.AddTeamMemberCommand{OrgId: 1, TeamId: team1.Id, UserId: userIds[0]}) err = AddTeamMember(&m.AddTeamMemberCommand{OrgId: 1, TeamId: team1.Id, UserId: userIds[0]})
So(err, ShouldBeNil) So(err, ShouldBeNil)
@ -76,6 +77,7 @@ func TestTeamCommandsAndQueries(t *testing.T) {
So(err, ShouldBeNil) So(err, ShouldBeNil)
So(len(query.Result), ShouldEqual, 1) So(len(query.Result), ShouldEqual, 1)
So(query.Result[0].Name, ShouldEqual, "group2 name") So(query.Result[0].Name, ShouldEqual, "group2 name")
So(query.Result[0].Email, ShouldEqual, "test2@test.com")
}) })
Convey("Should be able to remove users from a group", func() { Convey("Should be able to remove users from a group", func() {

View File

@ -600,7 +600,7 @@ func NewConfigContext(args *CommandLineArgs) error {
readQuotaSettings() readQuotaSettings()
if VerifyEmailEnabled && !Smtp.Enabled { if VerifyEmailEnabled && !Smtp.Enabled {
log.Warn("require_email_validation is enabled but smpt is disabled") log.Warn("require_email_validation is enabled but smtp is disabled")
} }
// check old key name // check old key name
@ -610,7 +610,7 @@ func NewConfigContext(args *CommandLineArgs) error {
} }
imageUploadingSection := Cfg.Section("external_image_storage") imageUploadingSection := Cfg.Section("external_image_storage")
ImageUploadProvider = imageUploadingSection.Key("provider").MustString("internal") ImageUploadProvider = imageUploadingSection.Key("provider").MustString("")
return nil return nil
} }

View File

@ -37,6 +37,7 @@ var customMetricsDimensionsMap map[string]map[string]map[string]*CustomMetricsCa
func init() { func init() {
metricsMap = map[string][]string{ metricsMap = map[string][]string{
"AWS/AmazonMQ": {"CpuUtilization", "HeapUsage", "NetworkIn", "NetworkOut", "TotalMessageCount", "ConsumerCount", "EnqueueCount", "EnqueueTime", "ExpiredCount", "InflightCount", "DispatchCount", "DequeueCount", "MemoryUsage", "ProducerCount", "QueueSize"},
"AWS/ApiGateway": {"4XXError", "5XXError", "CacheHitCount", "CacheMissCount", "Count", "IntegrationLatency", "Latency"}, "AWS/ApiGateway": {"4XXError", "5XXError", "CacheHitCount", "CacheMissCount", "Count", "IntegrationLatency", "Latency"},
"AWS/ApplicationELB": {"ActiveConnectionCount", "ClientTLSNegotiationErrorCount", "HealthyHostCount", "HTTPCode_ELB_4XX_Count", "HTTPCode_ELB_5XX_Count", "HTTPCode_Target_2XX_Count", "HTTPCode_Target_3XX_Count", "HTTPCode_Target_4XX_Count", "HTTPCode_Target_5XX_Count", "IPv6ProcessedBytes", "IPv6RequestCount", "NewConnectionCount", "ProcessedBytes", "RejectedConnectionCount", "RequestCount", "RequestCountPerTarget", "TargetConnectionErrorCount", "TargetResponseTime", "TargetTLSNegotiationErrorCount", "UnHealthyHostCount"}, "AWS/ApplicationELB": {"ActiveConnectionCount", "ClientTLSNegotiationErrorCount", "HealthyHostCount", "HTTPCode_ELB_4XX_Count", "HTTPCode_ELB_5XX_Count", "HTTPCode_Target_2XX_Count", "HTTPCode_Target_3XX_Count", "HTTPCode_Target_4XX_Count", "HTTPCode_Target_5XX_Count", "IPv6ProcessedBytes", "IPv6RequestCount", "NewConnectionCount", "ProcessedBytes", "RejectedConnectionCount", "RequestCount", "RequestCountPerTarget", "TargetConnectionErrorCount", "TargetResponseTime", "TargetTLSNegotiationErrorCount", "UnHealthyHostCount"},
"AWS/AutoScaling": {"GroupMinSize", "GroupMaxSize", "GroupDesiredCapacity", "GroupInServiceInstances", "GroupPendingInstances", "GroupStandbyInstances", "GroupTerminatingInstances", "GroupTotalInstances"}, "AWS/AutoScaling": {"GroupMinSize", "GroupMaxSize", "GroupDesiredCapacity", "GroupInServiceInstances", "GroupPendingInstances", "GroupStandbyInstances", "GroupTerminatingInstances", "GroupTotalInstances"},
@ -106,6 +107,7 @@ func init() {
"KMS": {"SecondsUntilKeyMaterialExpiration"}, "KMS": {"SecondsUntilKeyMaterialExpiration"},
} }
dimensionsMap = map[string][]string{ dimensionsMap = map[string][]string{
"AWS/AmazonMQ": {"Broker", "Topic", "Queue"},
"AWS/ApiGateway": {"ApiName", "Method", "Resource", "Stage"}, "AWS/ApiGateway": {"ApiName", "Method", "Resource", "Stage"},
"AWS/ApplicationELB": {"LoadBalancer", "TargetGroup", "AvailabilityZone"}, "AWS/ApplicationELB": {"LoadBalancer", "TargetGroup", "AvailabilityZone"},
"AWS/AutoScaling": {"AutoScalingGroupName"}, "AWS/AutoScaling": {"AutoScalingGroupName"},

View File

@ -83,41 +83,48 @@ func (e *PrometheusExecutor) getClient(dsInfo *models.DataSource) (apiv1.API, er
} }
func (e *PrometheusExecutor) Query(ctx context.Context, dsInfo *models.DataSource, tsdbQuery *tsdb.TsdbQuery) (*tsdb.Response, error) { func (e *PrometheusExecutor) Query(ctx context.Context, dsInfo *models.DataSource, tsdbQuery *tsdb.TsdbQuery) (*tsdb.Response, error) {
result := &tsdb.Response{} result := &tsdb.Response{
Results: map[string]*tsdb.QueryResult{},
}
client, err := e.getClient(dsInfo) client, err := e.getClient(dsInfo)
if err != nil { if err != nil {
return nil, err return nil, err
} }
query, err := parseQuery(dsInfo, tsdbQuery.Queries, tsdbQuery) queries, err := parseQuery(dsInfo, tsdbQuery.Queries, tsdbQuery)
if err != nil { if err != nil {
return nil, err return nil, err
} }
timeRange := apiv1.Range{ for _, query := range queries {
Start: query.Start, timeRange := apiv1.Range{
End: query.End, Start: query.Start,
Step: query.Step, End: query.End,
Step: query.Step,
}
plog.Debug("Sending query", "start", timeRange.Start, "end", timeRange.End, "step", timeRange.Step, "query", query.Expr)
span, ctx := opentracing.StartSpanFromContext(ctx, "alerting.prometheus")
span.SetTag("expr", query.Expr)
span.SetTag("start_unixnano", int64(query.Start.UnixNano()))
span.SetTag("stop_unixnano", int64(query.End.UnixNano()))
defer span.Finish()
value, err := client.QueryRange(ctx, query.Expr, timeRange)
if err != nil {
return nil, err
}
queryResult, err := parseResponse(value, query)
if err != nil {
return nil, err
}
result.Results[query.RefId] = queryResult
} }
span, ctx := opentracing.StartSpanFromContext(ctx, "alerting.prometheus")
span.SetTag("expr", query.Expr)
span.SetTag("start_unixnano", int64(query.Start.UnixNano()))
span.SetTag("stop_unixnano", int64(query.End.UnixNano()))
defer span.Finish()
value, err := client.QueryRange(ctx, query.Expr, timeRange)
if err != nil {
return nil, err
}
queryResult, err := parseResponse(value, query)
if err != nil {
return nil, err
}
result.Results = queryResult
return result, nil return result, nil
} }
@ -140,51 +147,54 @@ func formatLegend(metric model.Metric, query *PrometheusQuery) string {
return string(result) return string(result)
} }
func parseQuery(dsInfo *models.DataSource, queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) (*PrometheusQuery, error) { func parseQuery(dsInfo *models.DataSource, queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) ([]*PrometheusQuery, error) {
queryModel := queries[0] qs := []*PrometheusQuery{}
for _, queryModel := range queries {
expr, err := queryModel.Model.Get("expr").String()
if err != nil {
return nil, err
}
expr, err := queryModel.Model.Get("expr").String() format := queryModel.Model.Get("legendFormat").MustString("")
if err != nil {
return nil, err start, err := queryContext.TimeRange.ParseFrom()
if err != nil {
return nil, err
}
end, err := queryContext.TimeRange.ParseTo()
if err != nil {
return nil, err
}
dsInterval, err := tsdb.GetIntervalFrom(dsInfo, queryModel.Model, time.Second*15)
if err != nil {
return nil, err
}
intervalFactor := queryModel.Model.Get("intervalFactor").MustInt64(1)
interval := intervalCalculator.Calculate(queryContext.TimeRange, dsInterval)
step := time.Duration(int64(interval.Value) * intervalFactor)
qs = append(qs, &PrometheusQuery{
Expr: expr,
Step: step,
LegendFormat: format,
Start: start,
End: end,
RefId: queryModel.RefId,
})
} }
format := queryModel.Model.Get("legendFormat").MustString("") return qs, nil
start, err := queryContext.TimeRange.ParseFrom()
if err != nil {
return nil, err
}
end, err := queryContext.TimeRange.ParseTo()
if err != nil {
return nil, err
}
dsInterval, err := tsdb.GetIntervalFrom(dsInfo, queryModel.Model, time.Second*15)
if err != nil {
return nil, err
}
intervalFactor := queryModel.Model.Get("intervalFactor").MustInt64(1)
interval := intervalCalculator.Calculate(queryContext.TimeRange, dsInterval)
step := time.Duration(int64(interval.Value) * intervalFactor)
return &PrometheusQuery{
Expr: expr,
Step: step,
LegendFormat: format,
Start: start,
End: end,
}, nil
} }
func parseResponse(value model.Value, query *PrometheusQuery) (map[string]*tsdb.QueryResult, error) { func parseResponse(value model.Value, query *PrometheusQuery) (*tsdb.QueryResult, error) {
queryResults := make(map[string]*tsdb.QueryResult)
queryRes := tsdb.NewQueryResult() queryRes := tsdb.NewQueryResult()
data, ok := value.(model.Matrix) data, ok := value.(model.Matrix)
if !ok { if !ok {
return queryResults, fmt.Errorf("Unsupported result format: %s", value.Type().String()) return queryRes, fmt.Errorf("Unsupported result format: %s", value.Type().String())
} }
for _, v := range data { for _, v := range data {
@ -204,6 +214,5 @@ func parseResponse(value model.Value, query *PrometheusQuery) (map[string]*tsdb.
queryRes.Series = append(queryRes.Series, &series) queryRes.Series = append(queryRes.Series, &series)
} }
queryResults["A"] = queryRes return queryRes, nil
return queryResults, nil
} }
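
With parseQuery now returning a slice and parseResponse a single QueryResult, the executor keys results by each query's RefId instead of always writing them under "A". A hedged sketch of what callers see (assuming it sits in the same package as the executor; the RefIds are illustrative):

```go
package prometheus

import "github.com/grafana/grafana/pkg/tsdb"

// resultsByRefIdSketch is illustrative only: every panel query now gets its
// own entry in Results, keyed by that query's RefId.
func resultsByRefIdSketch(resp *tsdb.Response) []*tsdb.QueryResult {
	return []*tsdb.QueryResult{
		resp.Results["A"], // first query
		resp.Results["B"], // second query, previously ignored (only queries[0] was parsed)
	}
}
```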

View File

@ -60,9 +60,10 @@ func TestPrometheus(t *testing.T) {
Convey("with 48h time range", func() { Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("12h", "now") queryContext.TimeRange = tsdb.NewTimeRange("12h", "now")
model, err := parseQuery(dsInfo, queryModels, queryContext) models, err := parseQuery(dsInfo, queryModels, queryContext)
So(err, ShouldBeNil) So(err, ShouldBeNil)
model := models[0]
So(model.Step, ShouldEqual, time.Second*30) So(model.Step, ShouldEqual, time.Second*30)
}) })
}) })
@ -83,18 +84,22 @@ func TestPrometheus(t *testing.T) {
Convey("with 48h time range", func() { Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now") queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")
model, err := parseQuery(dsInfo, queryModels, queryContext) models, err := parseQuery(dsInfo, queryModels, queryContext)
So(err, ShouldBeNil) So(err, ShouldBeNil)
model := models[0]
So(model.Step, ShouldEqual, time.Minute*2) So(model.Step, ShouldEqual, time.Minute*2)
}) })
Convey("with 1h time range", func() { Convey("with 1h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("1h", "now") queryContext.TimeRange = tsdb.NewTimeRange("1h", "now")
model, err := parseQuery(dsInfo, queryModels, queryContext) models, err := parseQuery(dsInfo, queryModels, queryContext)
So(err, ShouldBeNil) So(err, ShouldBeNil)
model := models[0]
So(model.Step, ShouldEqual, time.Second*15) So(model.Step, ShouldEqual, time.Second*15)
}) })
}) })
@ -116,9 +121,11 @@ func TestPrometheus(t *testing.T) {
Convey("with 48h time range", func() { Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now") queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")
model, err := parseQuery(dsInfo, queryModels, queryContext) models, err := parseQuery(dsInfo, queryModels, queryContext)
So(err, ShouldBeNil) So(err, ShouldBeNil)
model := models[0]
So(model.Step, ShouldEqual, time.Minute*20) So(model.Step, ShouldEqual, time.Minute*20)
}) })
}) })
@ -139,9 +146,11 @@ func TestPrometheus(t *testing.T) {
Convey("with 48h time range", func() { Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now") queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")
model, err := parseQuery(dsInfo, queryModels, queryContext) models, err := parseQuery(dsInfo, queryModels, queryContext)
So(err, ShouldBeNil) So(err, ShouldBeNil)
model := models[0]
So(model.Step, ShouldEqual, time.Minute*2) So(model.Step, ShouldEqual, time.Minute*2)
}) })
}) })

View File

@ -8,4 +8,5 @@ type PrometheusQuery struct {
LegendFormat string LegendFormat string
Start time.Time Start time.Time
End time.Time End time.Time
RefId string
} }

View File

@ -27,6 +27,9 @@ _.move = function(array, fromIndex, toIndex) {
}; };
import { coreModule, registerAngularDirectives } from './core/core'; import { coreModule, registerAngularDirectives } from './core/core';
import { setupAngularRoutes } from './routes/routes';
declare var System: any;
export class GrafanaApp { export class GrafanaApp {
registerFunctions: any; registerFunctions: any;
@ -54,49 +57,40 @@ export class GrafanaApp {
moment.locale(config.bootData.user.locale); moment.locale(config.bootData.user.locale);
app.config( app.config(($locationProvider, $controllerProvider, $compileProvider, $filterProvider, $httpProvider, $provide) => {
( // pre-assign bindings before constructor calls
$locationProvider, $compileProvider.preAssignBindingsEnabled(true);
$controllerProvider,
$compileProvider,
$filterProvider,
$httpProvider,
$provide
) => {
) => { // pre-assign bindings before constructor calls
$compileProvider.preAssignBindingsEnabled(true);
if (config.buildInfo.env !== 'development') { if (config.buildInfo.env !== 'development') {
$compileProvider.debugInfoEnabled(false); $compileProvider.debugInfoEnabled(false);
}
$httpProvider.useApplyAsync(true);
this.registerFunctions.controller = $controllerProvider.register;
this.registerFunctions.directive = $compileProvider.directive;
this.registerFunctions.factory = $provide.factory;
this.registerFunctions.service = $provide.service;
this.registerFunctions.filter = $filterProvider.register;
$provide.decorator('$http', [
'$delegate',
'$templateCache',
function($delegate, $templateCache) {
var get = $delegate.get;
$delegate.get = function(url, config) {
if (url.match(/\.html$/)) {
// some templates already exist in the cache
if (!$templateCache.get(url)) {
url += '?v=' + new Date().getTime();
}
}
return get(url, config);
};
return $delegate;
},
]);
} }
);
$httpProvider.useApplyAsync(true);
this.registerFunctions.controller = $controllerProvider.register;
this.registerFunctions.directive = $compileProvider.directive;
this.registerFunctions.factory = $provide.factory;
this.registerFunctions.service = $provide.service;
this.registerFunctions.filter = $filterProvider.register;
$provide.decorator('$http', [
'$delegate',
'$templateCache',
function($delegate, $templateCache) {
var get = $delegate.get;
$delegate.get = function(url, config) {
if (url.match(/\.html$/)) {
// some templates already exist in the cache
if (!$templateCache.get(url)) {
url += '?v=' + new Date().getTime();
}
}
return get(url, config);
};
return $delegate;
},
]);
});
this.ngModuleDependencies = [ this.ngModuleDependencies = [
'grafana.core', 'grafana.core',
@ -111,14 +105,7 @@ export class GrafanaApp {
'react', 'react',
]; ];
var module_types = [ var module_types = ['controllers', 'directives', 'factories', 'services', 'filters', 'routes'];
'controllers',
'directives',
'factories',
'services',
'filters',
'routes',
];
_.each(module_types, type => { _.each(module_types, type => {
var moduleName = 'grafana.' + type; var moduleName = 'grafana.' + type;
@ -129,6 +116,7 @@ export class GrafanaApp {
this.useModule(coreModule); this.useModule(coreModule);
// register react angular wrappers // register react angular wrappers
coreModule.config(setupAngularRoutes);
registerAngularDirectives(); registerAngularDirectives();
var preBootRequires = [System.import('app/features/all')]; var preBootRequires = [System.import('app/features/all')];
@ -137,6 +125,7 @@ export class GrafanaApp {
.then(() => { .then(() => {
// disable tool tip animation // disable tool tip animation
$.fn.tooltip.defaults.animation = false; $.fn.tooltip.defaults.animation = false;
// bootstrap the app // bootstrap the app
angular.bootstrap(document, this.ngModuleDependencies).invoke(() => { angular.bootstrap(document, this.ngModuleDependencies).invoke(() => {
_.each(this.preBootModules, module => { _.each(this.preBootModules, module => {

View File

@ -0,0 +1,68 @@
import React from 'react';
import moment from 'moment';
import { AlertRuleList } from './AlertRuleList';
import { RootStore } from 'app/stores/RootStore/RootStore';
import { backendSrv, createNavTree } from 'test/mocks/common';
import { mount } from 'enzyme';
import toJson from 'enzyme-to-json';
describe('AlertRuleList', () => {
let page, store;
beforeAll(() => {
backendSrv.get.mockReturnValue(
Promise.resolve([
{
id: 11,
dashboardId: 58,
panelId: 3,
name: 'Panel Title alert',
state: 'ok',
newStateDate: moment()
.subtract(5, 'minutes')
.format(),
evalData: {},
executionError: '',
dashboardUri: 'db/mygool',
},
])
);
store = RootStore.create(
{},
{
backendSrv: backendSrv,
navTree: createNavTree('alerting', 'alert-list'),
}
);
page = mount(<AlertRuleList {...store} />);
});
it('should call api to get rules', () => {
expect(backendSrv.get.mock.calls[0][0]).toEqual('/api/alerts');
});
it('should render 1 rule', () => {
page.update();
let ruleNode = page.find('.alert-rule-item');
expect(toJson(ruleNode)).toMatchSnapshot();
});
it('toggle state should change pause rule if not paused', async () => {
backendSrv.post.mockReturnValue(
Promise.resolve({
state: 'paused',
})
);
page.find('.fa-pause').simulate('click');
// wait for api call to resolve
await Promise.resolve();
page.update();
expect(store.alertList.rules[0].state).toBe('paused');
expect(page.find('.fa-play')).toHaveLength(1);
});
});

View File

@ -0,0 +1,174 @@
import React from 'react';
import classNames from 'classnames';
import { inject, observer } from 'mobx-react';
import PageHeader from 'app/core/components/PageHeader/PageHeader';
import { IAlertRule } from 'app/stores/AlertListStore/AlertListStore';
import appEvents from 'app/core/app_events';
import IContainerProps from 'app/containers/IContainerProps';
import Highlighter from 'react-highlight-words';
@inject('view', 'nav', 'alertList')
@observer
export class AlertRuleList extends React.Component<IContainerProps, any> {
stateFilters = [
{ text: 'All', value: 'all' },
{ text: 'OK', value: 'ok' },
{ text: 'Not OK', value: 'not_ok' },
{ text: 'Alerting', value: 'alerting' },
{ text: 'No Data', value: 'no_data' },
{ text: 'Paused', value: 'paused' },
];
constructor(props) {
super(props);
this.props.nav.load('alerting', 'alert-list');
this.fetchRules();
}
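// Push the selected state into the view query string, then reload the rule list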
onStateFilterChanged = evt => {
this.props.view.updateQuery({ state: evt.target.value });
this.fetchRules();
};
fetchRules() {
this.props.alertList.loadRules({
state: this.props.view.query.get('state') || 'all',
});
}
onOpenHowTo = () => {
appEvents.emit('show-modal', {
src: 'public/app/features/alerting/partials/alert_howto.html',
modalClass: 'confirm-modal',
model: {},
});
};
onSearchQueryChange = evt => {
this.props.alertList.setSearchQuery(evt.target.value);
};
render() {
const { nav, alertList } = this.props;
return (
<div>
<PageHeader model={nav as any} />
<div className="page-container page-body">
<div className="page-action-bar">
<div className="gf-form gf-form--grow">
<label className="gf-form--has-input-icon gf-form--grow">
<input
type="text"
className="gf-form-input"
placeholder="Search alerts"
value={alertList.search}
onChange={this.onSearchQueryChange}
/>
<i className="gf-form-input-icon fa fa-search" />
</label>
</div>
<div className="gf-form">
<label className="gf-form-label">States</label>
<div className="gf-form-select-wrapper width-13">
<select className="gf-form-input" onChange={this.onStateFilterChanged} value={alertList.stateFilter}>
{this.stateFilters.map(AlertStateFilterOption)}
</select>
</div>
</div>
<div className="page-action-bar__spacer" />
<a className="btn btn-secondary" onClick={this.onOpenHowTo}>
<i className="fa fa-info-circle" /> How to add an alert
</a>
</div>
<section>
<ol className="alert-rule-list">
{alertList.filteredRules.map(rule => (
<AlertRuleItem rule={rule} key={rule.id} search={alertList.search} />
))}
</ol>
</section>
</div>
</div>
);
}
}
function AlertStateFilterOption({ text, value }) {
return (
<option key={value} value={value}>
{text}
</option>
);
}
export interface AlertRuleItemProps {
rule: IAlertRule;
search: string;
}
@observer
export class AlertRuleItem extends React.Component<AlertRuleItemProps, any> {
toggleState = () => {
this.props.rule.togglePaused();
};
renderText(text: string) {
return (
<Highlighter
highlightClassName="highlight-search-match"
textToHighlight={text}
searchWords={[this.props.search]}
/>
);
}
render() {
const { rule } = this.props;
let stateClass = classNames({
fa: true,
'fa-play': rule.isPaused,
'fa-pause': !rule.isPaused,
});
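// Link straight to the panel's alert tab in fullscreen edit mode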
let ruleUrl = `dashboard/${rule.dashboardUri}?panelId=${rule.panelId}&fullscreen&edit&tab=alert`;
return (
<li className="alert-rule-item">
<span className={`alert-rule-item__icon ${rule.stateClass}`}>
<i className={rule.stateIcon} />
</span>
<div className="alert-rule-item__body">
<div className="alert-rule-item__header">
<div className="alert-rule-item__name">
<a href={ruleUrl}>{this.renderText(rule.name)}</a>
</div>
<div className="alert-rule-item__text">
<span className={`${rule.stateClass}`}>{this.renderText(rule.stateText)}</span>
<span className="alert-rule-item__time"> for {rule.stateAge}</span>
</div>
</div>
{rule.info && <div className="small muted alert-rule-item__info">{this.renderText(rule.info)}</div>}
</div>
<div className="alert-rule-item__actions">
<a
className="btn btn-small btn-inverse alert-list__btn width-2"
title="Pausing an alert rule prevents it from executing"
onClick={this.toggleState}
>
<i className={stateClass} />
</a>
<a className="btn btn-small btn-inverse alert-list__btn width-2" href={ruleUrl} title="Edit alert rule">
<i className="icon-gf icon-gf-settings" />
</a>
</div>
</li>
);
}
}

View File

@ -0,0 +1,103 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`AlertRuleList should render 1 rule 1`] = `
<li
className="alert-rule-item"
>
<span
className="alert-rule-item__icon alert-state-ok"
>
<i
className="icon-gf icon-gf-online"
/>
</span>
<div
className="alert-rule-item__body"
>
<div
className="alert-rule-item__header"
>
<div
className="alert-rule-item__name"
>
<a
href="dashboard/db/mygool?panelId=3&fullscreen&edit&tab=alert"
>
<Highlighter
highlightClassName="highlight-search-match"
searchWords={
Array [
"",
]
}
textToHighlight="Panel Title alert"
>
<span>
<span
className=""
key="0"
>
Panel Title alert
</span>
</span>
</Highlighter>
</a>
</div>
<div
className="alert-rule-item__text"
>
<span
className="alert-state-ok"
>
<Highlighter
highlightClassName="highlight-search-match"
searchWords={
Array [
"",
]
}
textToHighlight="OK"
>
<span>
<span
className=""
key="0"
>
OK
</span>
</span>
</Highlighter>
</span>
<span
className="alert-rule-item__time"
>
for
5 minutes
</span>
</div>
</div>
</div>
<div
className="alert-rule-item__actions"
>
<a
className="btn btn-small btn-inverse alert-list__btn width-2"
onClick={[Function]}
title="Pausing an alert rule prevents it from executing"
>
<i
className="fa fa-pause"
/>
</a>
<a
className="btn btn-small btn-inverse alert-list__btn width-2"
href="dashboard/db/mygool?panelId=3&fullscreen&edit&tab=alert"
title="Edit alert rule"
>
<i
className="icon-gf icon-gf-settings"
/>
</a>
</div>
</li>
`;

View File

@ -0,0 +1,15 @@
import { SearchStore } from './../stores/SearchStore/SearchStore';
import { ServerStatsStore } from './../stores/ServerStatsStore/ServerStatsStore';
import { NavStore } from './../stores/NavStore/NavStore';
import { AlertListStore } from './../stores/AlertListStore/AlertListStore';
import { ViewStore } from './../stores/ViewStore/ViewStore';
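// Store slices every top-level container receives (via mobx-react inject or by spreading the root store)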
interface IContainerProps {
search: typeof SearchStore.Type;
serverStats: typeof ServerStatsStore.Type;
nav: typeof NavStore.Type;
alertList: typeof AlertListStore.Type;
view: typeof ViewStore.Type;
}
export default IContainerProps;

View File

@ -0,0 +1,30 @@
import React from 'react';
import renderer from 'react-test-renderer';
import { ServerStats } from './ServerStats';
import { RootStore } from 'app/stores/RootStore/RootStore';
import { backendSrv, createNavTree } from 'test/mocks/common';
describe('ServerStats', () => {
it('Should render table with stats', done => {
backendSrv.get.mockReturnValue(
Promise.resolve({
dashboards: 10,
})
);
const store = RootStore.create(
{},
{
backendSrv: backendSrv,
navTree: createNavTree('cfg', 'admin', 'server-stats'),
}
);
const page = renderer.create(<ServerStats {...store} />);
setTimeout(() => {
expect(page.toJSON()).toMatchSnapshot();
done();
});
});
});

View File

@ -0,0 +1,45 @@
import React from 'react';
import { inject, observer } from 'mobx-react';
import PageHeader from 'app/core/components/PageHeader/PageHeader';
import IContainerProps from 'app/containers/IContainerProps';
@inject('nav', 'serverStats')
@observer
export class ServerStats extends React.Component<IContainerProps, any> {
constructor(props) {
super(props);
const { nav, serverStats } = this.props;
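// Load the nav model and fetch server stats as soon as the container is constructed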
nav.load('cfg', 'admin', 'server-stats');
serverStats.load();
}
render() {
const { nav, serverStats } = this.props;
return (
<div>
<PageHeader model={nav as any} />
<div className="page-container page-body">
<table className="filter-table form-inline">
<thead>
<tr>
<th>Name</th>
<th>Value</th>
</tr>
</thead>
<tbody>{serverStats.stats.map(StatItem)}</tbody>
</table>
</div>
</div>
);
}
}
function StatItem(stat) {
return (
<tr key={stat.name}>
<td>{stat.name}</td>
<td>{stat.value}</td>
</tr>
);
}

View File

@ -0,0 +1,170 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`ServerStats Should render table with stats 1`] = `
<div>
<div
className="page-header-canvas"
>
<div
className="page-container"
>
<div
className="page-header"
>
<div
className="page-header__inner"
>
<span
className="page-header__logo"
>
</span>
<div
className="page-header__info-block"
>
<h1
className="page-header__title"
>
admin-Text
</h1>
</div>
</div>
<nav>
<div
className="gf-form-select-wrapper width-20 page-header__select-nav"
>
<label
className="gf-form-select-icon "
htmlFor="page-header-select-nav"
/>
<select
className="gf-select-nav gf-form-input"
defaultValue="/url/server-stats"
id="page-header-select-nav"
onChange={[Function]}
>
<option
value="/url/server-stats"
>
server-stats-Text
</option>
</select>
</div>
<ul
className="gf-tabs page-header__tabs"
>
<li
className="gf-tabs-item"
>
<a
className="gf-tabs-link active"
href="/url/server-stats"
target={undefined}
>
<i
className=""
/>
server-stats-Text
</a>
</li>
</ul>
</nav>
</div>
</div>
</div>
<div
className="page-container page-body"
>
<table
className="filter-table form-inline"
>
<thead>
<tr>
<th>
Name
</th>
<th>
Value
</th>
</tr>
</thead>
<tbody>
<tr>
<td>
Total dashboards
</td>
<td>
10
</td>
</tr>
<tr>
<td>
Total users
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Active users (seen last 30 days)
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total orgs
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total playlists
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total snapshots
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total dashboard tags
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total starred dashboards
</td>
<td>
0
</td>
</tr>
<tr>
<td>
Total alerts
</td>
<td>
0
</td>
</tr>
</tbody>
</table>
</div>
</div>
`;

View File

@ -3,10 +3,14 @@ import { PasswordStrength } from './components/PasswordStrength';
import PageHeader from './components/PageHeader/PageHeader';
import EmptyListCTA from './components/EmptyListCTA/EmptyListCTA';
import LoginBackground from './components/Login/LoginBackground';
+import { SearchResult } from './components/search/SearchResult';
+import UserPicker from './components/UserPicker/UserPicker';
export function registerAngularDirectives() {
react2AngularDirective('passwordStrength', PasswordStrength, ['password']);
react2AngularDirective('pageHeader', PageHeader, ['model', 'noTabs']);
react2AngularDirective('emptyListCta', EmptyListCTA, ['model']);
react2AngularDirective('loginBackground', LoginBackground, []);
+react2AngularDirective('searchResult', SearchResult, []);
+react2AngularDirective('selectUserPicker', UserPicker, ['backendSrv', 'teamId', 'refreshList']);
}

View File

@ -3,19 +3,18 @@ import renderer from 'react-test-renderer';
import EmptyListCTA from './EmptyListCTA';
const model = {
title: 'Title',
buttonIcon: 'ga css class',
buttonLink: 'http://url/to/destination',
buttonTitle: 'Click me',
proTip: 'This is a tip',
proTipLink: 'http://url/to/tip/destination',
proTipLinkTitle: 'Learn more',
-proTipTarget: '_blank'
+proTipTarget: '_blank',
};
-describe('CollorPalette', () => {
+describe('EmptyListCTA', () => {
it('renders correctly', () => {
const tree = renderer.create(<EmptyListCTA model={model} />).toJSON();
expect(tree).toMatchSnapshot();
});

View File

@ -1,6 +1,6 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
-exports[`CollorPalette renders correctly 1`] = `
+exports[`EmptyListCTA renders correctly 1`] = `
<div
className="empty-list-cta"
>

View File

@ -1,7 +1,7 @@
-import React from "react";
-import { NavModel, NavModelItem } from "../../nav_model_srv";
-import classNames from "classnames";
-import appEvents from "app/core/app_events";
+import React from 'react';
+import { NavModel, NavModelItem } from '../../nav_model_srv';
+import classNames from 'classnames';
+import appEvents from 'app/core/app_events';
export interface IProps {
model: NavModel;
@ -13,8 +13,8 @@ function TabItem(tab: NavModelItem) {
}
let tabClasses = classNames({
-"gf-tabs-link": true,
-active: tab.active
+'gf-tabs-link': true,
+active: tab.active,
});
return (
@ -49,13 +49,7 @@ function Navigation({ main }: { main: NavModelItem }) {
);
}
-function SelectNav({
-main,
-customCss
-}: {
-main: NavModelItem;
-customCss: string;
-}) {
+function SelectNav({ main, customCss }: { main: NavModelItem; customCss: string }) {
const defaultSelectedItem = main.children.find(navItem => {
return navItem.active === true;
});
@ -63,15 +57,12 @@ function SelectNav({
const gotoUrl = evt => {
var element = evt.target;
var url = element.options[element.selectedIndex].value;
-appEvents.emit("location-change", { href: url });
+appEvents.emit('location-change', { href: url });
};
return (
<div className={`gf-form-select-wrapper width-20 ${customCss}`}>
-<label
-className={`gf-form-select-icon ${defaultSelectedItem.icon}`}
-htmlFor="page-header-select-nav"
-/>
+<label className={`gf-form-select-icon ${defaultSelectedItem.icon}`} htmlFor="page-header-select-nav" />
{/* Label to make it clickable */}
<select
className="gf-select-nav gf-form-input"
@ -86,9 +77,7 @@
}
function Tabs({ main, customCss }: { main: NavModelItem; customCss: string }) {
-return (
-<ul className={`gf-tabs ${customCss}`}>{main.children.map(TabItem)}</ul>
-);
+return <ul className={`gf-tabs ${customCss}`}>{main.children.map(TabItem)}</ul>;
}
export default class PageHeader extends React.Component<IProps, any> {
@ -125,13 +114,9 @@ export default class PageHeader extends React.Component<IProps, any> {
{main.text && <h1 className="page-header__title">{main.text}</h1>}
{main.breadcrumbs &&
main.breadcrumbs.length > 0 && (
-<h1 className="page-header__title">
-{this.renderBreadcrumb(main.breadcrumbs)}
-</h1>
+<h1 className="page-header__title">{this.renderBreadcrumb(main.breadcrumbs)}</h1>
)}
-{main.subTitle && (
-<div className="page-header__sub-title">{main.subTitle}</div>
-)}
+{main.subTitle && <div className="page-header__sub-title">{main.subTitle}</div>}
{main.subType && (
<div className="page-header__stamps">
<i className={main.subType.icon} />

View File

@ -0,0 +1,16 @@
import React from 'react';
import renderer from 'react-test-renderer';
import Popover from './Popover';
describe('Popover', () => {
it('renders correctly', () => {
const tree = renderer
.create(
<Popover placement="auto" content="Popover text">
<button>Button with Popover</button>
</Popover>
)
.toJSON();
expect(tree).toMatchSnapshot();
});
});

View File

@ -0,0 +1,34 @@
import React from 'react';
import withTooltip from './withTooltip';
import { Target } from 'react-popper';
interface IPopoverProps {
tooltipSetState: (prevState: object) => void;
}
class Popover extends React.Component<IPopoverProps, any> {
constructor(props) {
super(props);
this.toggleTooltip = this.toggleTooltip.bind(this);
}
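// Visibility state lives in the withTooltip wrapper; clicking the target toggles it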
toggleTooltip() {
const { tooltipSetState } = this.props;
tooltipSetState(prevState => {
return {
...prevState,
show: !prevState.show,
};
});
}
render() {
return (
<Target className="popper__target" onClick={this.toggleTooltip}>
{this.props.children}
</Target>
);
}
}
export default withTooltip(Popover);

View File

@ -0,0 +1,16 @@
import React from 'react';
import renderer from 'react-test-renderer';
import Tooltip from './Tooltip';
describe('Tooltip', () => {
it('renders correctly', () => {
const tree = renderer
.create(
<Tooltip placement="auto" content="Tooltip text">
<a href="http://www.grafana.com">Link with tooltip</a>
</Tooltip>
)
.toJSON();
expect(tree).toMatchSnapshot();
});
});

View File

@ -0,0 +1,45 @@
import React from 'react';
import withTooltip from './withTooltip';
import { Target } from 'react-popper';
interface ITooltipProps {
tooltipSetState: (prevState: object) => void;
}
class Tooltip extends React.Component<ITooltipProps, any> {
constructor(props) {
super(props);
this.showTooltip = this.showTooltip.bind(this);
this.hideTooltip = this.hideTooltip.bind(this);
}
showTooltip() {
const { tooltipSetState } = this.props;
tooltipSetState(prevState => {
return {
...prevState,
show: true,
};
});
}
hideTooltip() {
const { tooltipSetState } = this.props;
tooltipSetState(prevState => {
return {
...prevState,
show: false,
};
});
}
render() {
return (
<Target className="popper__target" onMouseOver={this.showTooltip} onMouseOut={this.hideTooltip}>
{this.props.children}
</Target>
);
}
}
export default withTooltip(Tooltip);

View File

@ -0,0 +1,16 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`Popover renders correctly 1`] = `
<div
className="popper__manager"
>
<div
className="popper__target"
onClick={[Function]}
>
<button>
Button with Popover
</button>
</div>
</div>
`;

View File

@ -0,0 +1,19 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`Tooltip renders correctly 1`] = `
<div
className="popper__manager"
>
<div
className="popper__target"
onMouseOut={[Function]}
onMouseOver={[Function]}
>
<a
href="http://www.grafana.com"
>
Link with tooltip
</a>
</div>
</div>
`;

View File

@ -0,0 +1,57 @@
import React from 'react';
import { Manager, Popper, Arrow } from 'react-popper';
interface IwithTooltipProps {
placement?: string;
content: string | ((props: any) => JSX.Element);
}
export default function withTooltip(WrappedComponent) {
return class extends React.Component<IwithTooltipProps, any> {
constructor(props) {
super(props);
this.setState = this.setState.bind(this);
this.state = {
placement: this.props.placement || 'auto',
show: false,
};
}
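// Keep the popper placement in sync if the placement prop changes after mount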
componentWillReceiveProps(nextProps) {
if (nextProps.placement && nextProps.placement !== this.state.placement) {
this.setState(prevState => {
return {
...prevState,
placement: nextProps.placement,
};
});
}
}
renderContent(content) {
if (typeof content === 'function') {
// If it's a function we assume it's a React component
const ReactComponent = content;
return <ReactComponent />;
}
return content;
}
render() {
const { content } = this.props;
return (
<Manager className="popper__manager">
<WrappedComponent {...this.props} tooltipSetState={this.setState} />
{this.state.show ? (
<Popper placement={this.state.placement} className="popper">
{this.renderContent(content)}
<Arrow className="popper__arrow" />
</Popper>
) : null}
</Manager>
);
}
};
}

View File

@ -0,0 +1,20 @@
import React from 'react';
import renderer from 'react-test-renderer';
import UserPicker from './UserPicker';
const model = {
backendSrv: {
get: () => {
return new Promise((resolve, reject) => {});
},
},
refreshList: () => {},
teamId: '1',
};
describe('UserPicker', () => {
it('renders correctly', () => {
const tree = renderer.create(<UserPicker {...model} />).toJSON();
expect(tree).toMatchSnapshot();
});
});

View File

@ -0,0 +1,108 @@
import React, { Component } from 'react';
import { debounce } from 'lodash';
import Select from 'react-select';
import UserPickerOption from './UserPickerOption';
export interface IProps {
backendSrv: any;
teamId: string;
refreshList: any;
}
export interface User {
id: number;
name: string;
login: string;
email: string;
}
class UserPicker extends Component<IProps, any> {
debouncedSearchUsers: any;
backendSrv: any;
teamId: string;
refreshList: any;
constructor(props) {
super(props);
this.backendSrv = this.props.backendSrv;
this.teamId = this.props.teamId;
this.refreshList = this.props.refreshList;
this.searchUsers = this.searchUsers.bind(this);
this.handleChange = this.handleChange.bind(this);
this.addUser = this.addUser.bind(this);
this.toggleLoading = this.toggleLoading.bind(this);
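// Debounce searches (300ms, leading edge) so typing does not flood the user search API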
this.debouncedSearchUsers = debounce(this.searchUsers, 300, {
leading: true,
trailing: false,
});
this.state = {
multi: false,
isLoading: false,
};
}
handleChange(user) {
this.addUser(user.id);
}
toggleLoading(isLoading) {
this.setState(prevState => {
return {
...prevState,
isLoading: isLoading,
};
});
}
addUser(userId) {
this.toggleLoading(true);
this.backendSrv.post(`/api/teams/${this.teamId}/members`, { userId: userId }).then(() => {
this.refreshList();
this.toggleLoading(false);
});
}
searchUsers(query) {
this.toggleLoading(true);
return this.backendSrv.get(`/api/users/search?perpage=10&page=1&query=${query}`).then(result => {
const users = result.users.map(user => {
return {
id: user.id,
label: `${user.login} - ${user.email}`,
avatarUrl: user.avatarUrl,
};
});
this.toggleLoading(false);
return { options: users };
});
}
render() {
const AsyncComponent = this.state.creatable ? Select.AsyncCreatable : Select.Async;
return (
<div className="user-picker">
<AsyncComponent
valueKey="id"
multi={this.state.multi}
labelKey="label"
cache={false}
isLoading={this.state.isLoading}
loadOptions={this.debouncedSearchUsers}
loadingPlaceholder="Loading..."
noResultsText="No users found"
onChange={this.handleChange}
className="width-8 gf-form-input gf-form-input--form-dropdown"
optionComponent={UserPickerOption}
placeholder="Choose"
/>
</div>
);
}
}
export default UserPicker;

View File

@ -0,0 +1,22 @@
import React from 'react';
import renderer from 'react-test-renderer';
import UserPickerOption from './UserPickerOption';
const model = {
onSelect: () => {},
onFocus: () => {},
isFocused: () => {},
option: {
title: 'Model title',
avatarUrl: 'url/to/avatar',
label: 'User picker label',
},
className: 'class-for-user-picker',
};
describe('UserPickerOption', () => {
it('renders correctly', () => {
const tree = renderer.create(<UserPickerOption {...model} />).toJSON();
expect(tree).toMatchSnapshot();
});
});

View File

@ -0,0 +1,54 @@
import React, { Component } from 'react';
export interface IProps {
onSelect: any;
onFocus: any;
option: any;
isFocused: any;
className: any;
}
class UserPickerOption extends Component<IProps, any> {
constructor(props) {
super(props);
this.handleMouseDown = this.handleMouseDown.bind(this);
this.handleMouseEnter = this.handleMouseEnter.bind(this);
this.handleMouseMove = this.handleMouseMove.bind(this);
}
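// Select the option on mousedown and stop the event so the underlying select keeps focus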
handleMouseDown(event) {
event.preventDefault();
event.stopPropagation();
this.props.onSelect(this.props.option, event);
}
handleMouseEnter(event) {
this.props.onFocus(this.props.option, event);
}
handleMouseMove(event) {
if (this.props.isFocused) {
return;
}
this.props.onFocus(this.props.option, event);
}
render() {
const { option, children, className } = this.props;
return (
<button
onMouseDown={this.handleMouseDown}
onMouseEnter={this.handleMouseEnter}
onMouseMove={this.handleMouseMove}
title={option.title}
className={`user-picker-option__button btn btn-link ${className}`}
>
<img src={option.avatarUrl} alt={option.label} className="user-picker-option__avatar" />
{children}
</button>
);
}
}
export default UserPickerOption;

View File

@ -0,0 +1,98 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`UserPicker renders correctly 1`] = `
<div
className="user-picker"
>
<div
className="Select width-8 gf-form-input gf-form-input--form-dropdown is-clearable is-loading is-searchable Select--single"
style={undefined}
>
<div
className="Select-control"
onKeyDown={[Function]}
onMouseDown={[Function]}
onTouchEnd={[Function]}
onTouchMove={[Function]}
onTouchStart={[Function]}
style={undefined}
>
<span
className="Select-multi-value-wrapper"
id="react-select-2--value"
>
<div
className="Select-placeholder"
>
Loading...
</div>
<div
className="Select-input"
style={
Object {
"display": "inline-block",
}
}
>
<input
aria-activedescendant="react-select-2--value"
aria-describedby={undefined}
aria-expanded="false"
aria-haspopup="false"
aria-label={undefined}
aria-labelledby={undefined}
aria-owns=""
className={undefined}
id={undefined}
onBlur={[Function]}
onChange={[Function]}
onFocus={[Function]}
required={false}
role="combobox"
style={
Object {
"boxSizing": "content-box",
"width": "5px",
}
}
tabIndex={undefined}
value=""
/>
<div
style={
Object {
"height": 0,
"left": 0,
"overflow": "scroll",
"position": "absolute",
"top": 0,
"visibility": "hidden",
"whiteSpace": "pre",
}
}
>
</div>
</div>
</span>
<span
aria-hidden="true"
className="Select-loading-zone"
>
<span
className="Select-loading"
/>
</span>
<span
className="Select-arrow-zone"
onMouseDown={[Function]}
>
<span
className="Select-arrow"
onMouseDown={[Function]}
/>
</span>
</div>
</div>
</div>
`;

View File

@ -0,0 +1,17 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`UserPickerOption renders correctly 1`] = `
<button
className="user-picker-option__button btn btn-link class-for-user-picker"
onMouseDown={[Function]}
onMouseEnter={[Function]}
onMouseMove={[Function]}
title="Model title"
>
<img
alt="User picker label"
className="user-picker-option__avatar"
src="url/to/avatar"
/>
</button>
`;

View File

@ -56,9 +56,7 @@ function link(scope, elem, attrs) {
let maxLines = attrs.maxLines || DEFAULT_MAX_LINES;
let showGutter = attrs.showGutter !== undefined;
let tabSize = attrs.tabSize || DEFAULT_TAB_SIZE;
-let behavioursEnabled = attrs.behavioursEnabled
-? attrs.behavioursEnabled === 'true'
-: DEFAULT_BEHAVIOURS;
+let behavioursEnabled = attrs.behavioursEnabled ? attrs.behavioursEnabled === 'true' : DEFAULT_BEHAVIOURS;
// Initialize editor
let aceElem = elem.get(0);

View File

@ -12,8 +12,7 @@ export function spectrumPicker() {
require: 'ngModel',
scope: true,
replace: true,
-template:
-'<color-picker color="ngModel.$viewValue" onChange="onColorChange"></color-picker>',
+template: '<color-picker color="ngModel.$viewValue" onChange="onColorChange"></color-picker>',
link: function(scope, element, attrs, ngModel) {
scope.ngModel = ngModel;
scope.onColorChange = color => {

View File

@ -32,13 +32,7 @@ export class FormDropdownCtrl {
lookupText: boolean;
/** @ngInject **/
-constructor(
-private $scope,
-$element,
-private $sce,
-private templateSrv,
-private $q
-) {
+constructor(private $scope, $element, private $sce, private templateSrv, private $q) {
this.inputElement = $element.find('input').first();
this.linkElement = $element.find('a').first();
this.linkMode = true;
@ -50,8 +44,7 @@ export class FormDropdownCtrl {
if (this.labelMode) {
this.cssClasses = 'gf-form-label ' + this.cssClass;
} else {
-this.cssClasses =
-'gf-form-input gf-form-input--dropdown ' + this.cssClass;
+this.cssClasses = 'gf-form-input gf-form-input--dropdown ' + this.cssClass;
}
this.inputElement.attr('data-provide', 'typeahead');
@ -207,16 +200,11 @@ export class FormDropdownCtrl {
updateDisplay(text) {
this.text = text;
-this.display = this.$sce.trustAsHtml(
-this.templateSrv.highlightVariablesAsHtml(text)
-);
+this.display = this.$sce.trustAsHtml(this.templateSrv.highlightVariablesAsHtml(text));
}
open() {
-this.inputElement.css(
-'width',
-Math.max(this.linkElement.width(), 80) + 16 + 'px'
-);
+this.inputElement.css('width', Math.max(this.linkElement.width(), 80) + 16 + 'px');
this.inputElement.show();
this.inputElement.focus();

View File

@ -1,5 +1,3 @@
///<reference path="../../headers/common.d.ts" />
import coreModule from 'app/core/core_module';
const template = `

View File

@ -6,32 +6,29 @@ import coreModule from 'app/core/core_module';
import { profiler } from 'app/core/profiler';
import appEvents from 'app/core/app_events';
import Drop from 'tether-drop';
+import { createStore } from 'app/stores/store';
+import colors from 'app/core/utils/colors';
export class GrafanaCtrl {
/** @ngInject */
-constructor(
-$scope,
-alertSrv,
-utilSrv,
-$rootScope,
-$controller,
-contextSrv,
-globalEventSrv
-) {
+constructor($scope, alertSrv, utilSrv, $rootScope, $controller, contextSrv, bridgeSrv, backendSrv) {
+createStore(backendSrv);
$scope.init = function() {
$scope.contextSrv = contextSrv;
$scope.appSubUrl = config.appSubUrl;
$rootScope.appSubUrl = config.appSubUrl;
$scope._ = _;
profiler.init(config, $rootScope);
alertSrv.init();
utilSrv.init();
-globalEventSrv.init();
+bridgeSrv.init();
$scope.dashAlerts = alertSrv;
};
+$rootScope.colors = colors;
$scope.initDashboard = function(dashboardData, viewScope) {
$scope.appEvent('dashboard-fetch-end', dashboardData);
$controller('DashboardCtrl', { $scope: viewScope }).init(dashboardData);
@ -54,76 +51,12 @@ export class GrafanaCtrl {
appEvents.emit(name, payload);
};
$rootScope.colors = [
'#7EB26D',
'#EAB839',
'#6ED0E0',
'#EF843C',
'#E24D42',
'#1F78C1',
'#BA43A9',
'#705DA0',
'#508642',
'#CCA300',
'#447EBC',
'#C15C17',
'#890F02',
'#0A437C',
'#6D1F62',
'#584477',
'#B7DBAB',
'#F4D598',
'#70DBED',
'#F9BA8F',
'#F29191',
'#82B5D8',
'#E5A8E2',
'#AEA2E0',
'#629E51',
'#E5AC0E',
'#64B0C8',
'#E0752D',
'#BF1B00',
'#0A50A1',
'#962D82',
'#614D93',
'#9AC48A',
'#F2C96D',
'#65C5DB',
'#F9934E',
'#EA6460',
'#5195CE',
'#D683CE',
'#806EB7',
'#3F6833',
'#967302',
'#2F575E',
'#99440A',
'#58140C',
'#052B51',
'#511749',
'#3F2B5B',
'#E0F9D7',
'#FCEACA',
'#CFFAFF',
'#F9E2D2',
'#FCE2DE',
'#BADFF4',
'#F9D9F9',
'#DEDAF7',
];
$scope.init();
}
}
/** @ngInject */
-export function grafanaAppDirective(
-playlistSrv,
-contextSrv,
-$timeout,
-$rootScope
-) {
+export function grafanaAppDirective(playlistSrv, contextSrv, $timeout, $rootScope, $location) {
return {
restrict: 'E',
controller: GrafanaCtrl,
@ -269,10 +202,7 @@ export function grafanaAppDirective(
// hide search
if (body.find('.search-container').length > 0) {
-if (
-target.parents('.search-results-container, .search-field-wrapper')
-.length === 0
-) {
+if (target.parents('.search-results-container, .search-field-wrapper').length === 0) {
scope.$apply(function() {
scope.appEvent('hide-dash-search');
});
@ -281,10 +211,7 @@ export function grafanaAppDirective(
// hide popovers
var popover = elem.find('.popover');
-if (
-popover.length > 0 &&
-target.parents('.graph-legend').length === 0
-) {
+if (popover.length > 0 && target.parents('.graph-legend').length === 0) {
popover.hide();
}
});

View File

@ -1,5 +1,3 @@
///<reference path="../../../headers/common.d.ts" />
import coreModule from '../../core_module';
import appEvents from 'app/core/app_events';

View File

@ -1,5 +1,3 @@
///<reference path="../../headers/common.d.ts" />
import _ from 'lodash';
import coreModule from 'app/core/core_module';
import Drop from 'tether-drop';
@ -10,10 +8,10 @@ export function infoPopover() {
template: '<i class="fa fa-info-circle"></i>',
transclude: true,
link: function(scope, elem, attrs, ctrl, transclude) {
-var offset = attrs.offset || '0 -10px';
-var position = attrs.position || 'right middle';
-var classes = 'drop-help drop-hide-out-of-bounds';
-var openOn = 'hover';
+let offset = attrs.offset || '0 -10px';
+let position = attrs.position || 'right middle';
+let classes = 'drop-help drop-hide-out-of-bounds';
+let openOn = 'hover';
elem.addClass('gf-form-help-icon');
@ -26,14 +24,14 @@ export function infoPopover() {
}
transclude(function(clone, newScope) {
-var content = document.createElement('div');
+let content = document.createElement('div');
content.className = 'markdown-html';
_.each(clone, node => {
content.appendChild(node);
});
-var drop = new Drop({
+let dropOptions = {
target: elem[0],
content: content,
position: position,
@ -50,11 +48,16 @@ export function infoPopover() {
},
],
},
-});
+};
-var unbind = scope.$on('$destroy', function() {
-drop.destroy();
-unbind();
+// Create drop in next digest after directive content is rendered.
+scope.$applyAsync(() => {
+let drop = new Drop(dropOptions);
+let unbind = scope.$on('$destroy', function() {
+drop.destroy();
+unbind();
+});
});
});
},

View File

@ -103,11 +103,7 @@ export function cssClass(className: string): string {
* Creates a new DOM element wiht given type and class * Creates a new DOM element wiht given type and class
* TODO: move me to helpers * TODO: move me to helpers
*/ */
export function createElement( export function createElement(type: string, className?: string, content?: Element | string): Element {
type: string,
className?: string,
content?: Element | string
): Element {
const el = document.createElement(type); const el = document.createElement(type);
if (className) { if (className) {
el.classList.add(cssClass(className)); el.classList.add(cssClass(className));

View File

@ -1,14 +1,7 @@
// Based on work https://github.com/mohsen1/json-formatter-js
// Licence MIT, Copyright (c) 2015 Mohsen Azimi
-import {
-isObject,
-getObjectName,
-getType,
-getValuePreview,
-cssClass,
-createElement,
-} from './helpers';
+import { isObject, getObjectName, getType, getValuePreview, cssClass, createElement } from './helpers';
import _ from 'lodash';
@ -112,9 +105,7 @@ export class JsonExplorer {
private get isDate(): boolean {
return (
this.type === 'string' &&
-(DATE_STRING_REGEX.test(this.json) ||
-JSON_DATE_REGEX.test(this.json) ||
-PARTIAL_DATE_REGEX.test(this.json))
+(DATE_STRING_REGEX.test(this.json) || JSON_DATE_REGEX.test(this.json) || PARTIAL_DATE_REGEX.test(this.json))
);
}
@ -151,9 +142,7 @@ export class JsonExplorer {
* is this an empty object or array?
*/
private get isEmpty(): boolean {
-return (
-this.isEmptyObject || (this.keys && !this.keys.length && this.isArray)
-);
+return this.isEmptyObject || (this.keys && !this.keys.length && this.isArray);
}
/*
@ -234,11 +223,7 @@ export class JsonExplorer {
}
isNumberArray() {
-return (
-this.json.length > 0 &&
-this.json.length < 4 &&
-(_.isNumber(this.json[0]) || _.isNumber(this.json[1]))
-);
+return this.json.length > 0 && this.json.length < 4 && (_.isNumber(this.json[0]) || _.isNumber(this.json[1]));
}
renderArray() {
@ -249,17 +234,13 @@ export class JsonExplorer {
if (this.isNumberArray()) {
this.json.forEach((val, index) => {
if (index > 0) {
-arrayWrapperSpan.appendChild(
-createElement('span', 'array-comma', ',')
-);
+arrayWrapperSpan.appendChild(createElement('span', 'array-comma', ','));
}
arrayWrapperSpan.appendChild(createElement('span', 'number', val));
});
this.skipChildren = true;
} else {
-arrayWrapperSpan.appendChild(
-createElement('span', 'number', this.json.length)
-);
+arrayWrapperSpan.appendChild(createElement('span', 'number', this.json.length));
}
arrayWrapperSpan.appendChild(createElement('span', 'bracket', ']'));
@ -298,11 +279,7 @@ export class JsonExplorer {
const objectWrapperSpan = createElement('span');
// get constructor name and append it to wrapper span
-var constructorName = createElement(
-'span',
-'constructor-name',
-this.constructorName
-);
+var constructorName = createElement('span', 'constructor-name', this.constructorName);
objectWrapperSpan.appendChild(constructorName);
// if it's an array append the array specific elements like brackets and length
@ -399,12 +376,7 @@ export class JsonExplorer {
let index = 0;
const addAChild = () => {
const key = this.keys[index];
-const formatter = new JsonExplorer(
-this.json[key],
-this.open - 1,
-this.config,
-key
-);
+const formatter = new JsonExplorer(this.json[key], this.open - 1, this.config, key);
children.appendChild(formatter.render());
index += 1;
@ -421,12 +393,7 @@ export class JsonExplorer {
requestAnimationFrame(addAChild);
} else {
this.keys.forEach(key => {
-const formatter = new JsonExplorer(
-this.json[key],
-this.open - 1,
-this.config,
-key
-);
+const formatter = new JsonExplorer(this.json[key], this.open - 1, this.config, key);
children.appendChild(formatter.render());
});
}
@ -437,9 +404,7 @@ export class JsonExplorer {
* Animated option is used when user triggers this via a click
*/
removeChildren(animated = false) {
-const childrenElement = this.element.querySelector(
-`div.${cssClass('children')}`
-) as HTMLDivElement;
+const childrenElement = this.element.querySelector(`div.${cssClass('children')}`) as HTMLDivElement;
if (animated) {
let childrenRemoved = 0;

View File

@ -5,11 +5,11 @@
<i class="gf-form-input-icon fa fa-search"></i>
</label>
<div class="page-action-bar__spacer"></div>
-<a class="btn btn-success" href="/dashboard/new?folderId={{ctrl.folderId}}">
+<a class="btn btn-success" ng-href="{{ctrl.createDashboardUrl()}}">
<i class="fa fa-plus"></i>
Dashboard
</a>
-<a class="btn btn-success" href="/dashboards/folder/new" ng-if="!ctrl.folderId">
+<a class="btn btn-success" href="dashboards/folder/new" ng-if="!ctrl.folderId">
<i class="fa fa-plus"></i>
Folder
</a>
@ -60,22 +60,20 @@
switch-class="gf-form-switch--transparent gf-form-switch--search-result-filter-row__checkbox"
/>
<div class="search-results-filter-row__filters">
-<div class="gf-form-select-wrapper">
+<div class="gf-form-select-wrapper" ng-show="!(ctrl.canMove || ctrl.canDelete)">
<select
class="search-results-filter-row__filters-item gf-form-input"
ng-model="ctrl.selectedStarredFilter"
ng-options="t.text disable when t.disabled for t in ctrl.starredFilterOptions"
ng-change="ctrl.onStarredFilterChange()"
-ng-show="!(ctrl.canMove || ctrl.canDelete)"
/>
</div>
-<div class="gf-form-select-wrapper">
+<div class="gf-form-select-wrapper" ng-show="!(ctrl.canMove || ctrl.canDelete)">
<select
class="search-results-filter-row__filters-item gf-form-input"
ng-model="ctrl.selectedTagFilter"
ng-options="t.term disable when t.disabled for t in ctrl.tagFilterOptions"
ng-change="ctrl.onTagFilterChange()"
-ng-show="!(ctrl.canMove || ctrl.canDelete)"
/>
</div>
<div class="gf-form-button-row" ng-show="ctrl.canMove || ctrl.canDelete">
@ -110,11 +108,11 @@
<empty-list-cta model="{
title: 'This folder doesn\'t have any dashboards yet',
buttonIcon: 'gicon gicon-dashboard-new',
-buttonLink: '/dashboard/new?folderId={{ctrl.folderId}}',
+buttonLink: 'dashboard/new?folderId={{ctrl.folderId}}',
buttonTitle: 'Create Dashboard',
proTip: 'Add dashboards into your folder at ->',
-proTipLink: '/dashboards',
+proTipLink: 'dashboards',
proTipLinkTitle: 'Manage dashboards',
-proTipTarget: '_blank'
+proTipTarget: ''
}" />
</div>

View File

@ -13,11 +13,7 @@ export class ManageDashboardsCtrl {
canMove = false;
hasFilters = false;
selectAllChecked = false;
-starredFilterOptions = [
-{ text: 'Filter by Starred', disabled: true },
-{ text: 'Yes' },
-{ text: 'No' },
-];
+starredFilterOptions = [{ text: 'Filter by Starred', disabled: true }, { text: 'Yes' }, { text: 'No' }];
selectedStarredFilter: any;
folderId?: number;
@ -53,10 +49,7 @@ export class ManageDashboardsCtrl {
this.canMove = false;
this.canDelete = false;
this.selectAllChecked = false;
-this.hasFilters =
-this.query.query.length > 0 ||
-this.query.tag.length > 0 ||
-this.query.starred;
+this.hasFilters = this.query.query.length > 0 || this.query.tag.length > 0 || this.query.starred;
if (!result) {
this.sections = [];
@ -126,16 +119,10 @@ export class ManageDashboardsCtrl {
let text2;
if (folderCount > 0 && dashCount > 0) {
-text += `selected folder${folderCount === 1 ? '' : 's'} and dashboard${
-dashCount === 1 ? '' : 's'
-}?`;
-text2 = `All dashboards of the selected folder${
-folderCount === 1 ? '' : 's'
-} will also be deleted`;
+text += `selected folder${folderCount === 1 ? '' : 's'} and dashboard${dashCount === 1 ? '' : 's'}?`;
+text2 = `All dashboards of the selected folder${folderCount === 1 ? '' : 's'} will also be deleted`;
} else if (folderCount > 0) {
-text += `selected folder${
-folderCount === 1 ? '' : 's'
-} and all its dashboards?`;
+text += `selected folder${folderCount === 1 ? '' : 's'} and all its dashboards?`;
} else {
text += `selected dashboard${dashCount === 1 ? '' : 's'}?`;
}
@ -165,22 +152,16 @@ export class ManageDashboardsCtrl {
let msg;
if (folderCount > 0 && dashCount > 0) {
-header = `Folder${folderCount === 1 ? '' : 's'} And Dashboard${
-dashCount === 1 ? '' : 's'
-} Deleted`;
+header = `Folder${folderCount === 1 ? '' : 's'} And Dashboard${dashCount === 1 ? '' : 's'} Deleted`;
msg = `${folderCount} folder${folderCount === 1 ? '' : 's'} `;
-msg += `and ${dashCount} dashboard${
-dashCount === 1 ? '' : 's'
-} has been deleted`;
+msg += `and ${dashCount} dashboard${dashCount === 1 ? '' : 's'} has been deleted`;
} else if (folderCount > 0) {
header = `Folder${folderCount === 1 ? '' : 's'} Deleted`;
if (folderCount === 1) {
msg = `${folders[0].dashboard.title} has been deleted`;
} else {
-msg = `${folderCount} folder${
-folderCount === 1 ? '' : 's'
-} has been deleted`;
+msg = `${folderCount} folder${folderCount === 1 ? '' : 's'} has been deleted`;
}
} else if (dashCount > 0) {
header = `Dashboard${dashCount === 1 ? '' : 's'} Deleted`;
@ -188,9 +169,7 @@ export class ManageDashboardsCtrl {
if (dashCount === 1) {
msg = `${dashboards[0].dashboard.title} has been deleted`;
} else {
-msg = `${dashCount} dashboard${
-dashCount === 1 ? '' : 's'
-} has been deleted`;
+msg = `${dashCount} dashboard${dashCount === 1 ? '' : 's'} has been deleted`;
}
}
@ -231,9 +210,7 @@ export class ManageDashboardsCtrl {
getTags() {
return this.searchSrv.getDashboardTags().then(results => {
-this.tagFilterOptions = [
-{ term: 'Filter By Tag', disabled: true },
-].concat(results);
+this.tagFilterOptions = [{ term: 'Filter By Tag', disabled: true }].concat(results);
this.selectedTagFilter = this.tagFilterOptions[0];
});
}
@ -297,13 +274,22 @@ export class ManageDashboardsCtrl {
this.query.starred = false;
this.getDashboards();
}
+createDashboardUrl() {
+let url = 'dashboard/new';
+if (this.folderId) {
+url += `?folderId=${this.folderId}`;
+}
+return url;
+}
}
export function manageDashboardsDirective() {
return {
restrict: 'E',
-templateUrl:
-'public/app/core/components/manage_dashboards/manage_dashboards.html',
+templateUrl: 'public/app/core/components/manage_dashboards/manage_dashboards.html',
controller: ManageDashboardsCtrl,
bindToController: true,
controllerAs: 'ctrl',

View File

@ -1,5 +1,3 @@
///<reference path="../../headers/common.d.ts" />
import coreModule from 'app/core/core_module';
import { contextSrv } from 'app/core/services/context_srv';
@ -63,9 +61,7 @@ export class OrgSwitchCtrl {
setUsingOrg(org) {
return this.backendSrv.post('/api/user/using/' + org.orgId).then(() => {
const re = /orgId=\d+/gi;
-this.setWindowLocationHref(
-this.getWindowLocationHref().replace(re, 'orgId=' + org.orgId)
-);
+this.setWindowLocationHref(this.getWindowLocationHref().replace(re, 'orgId=' + org.orgId));
});
}

View File

@ -1,5 +1,3 @@
///<reference path="../../../headers/common.d.ts" />
import _ from 'lodash';
export class QueryPartDef {

View File

@ -1,5 +1,3 @@
///<reference path="../../../headers/common.d.ts" />
import _ from 'lodash';
import $ from 'jquery';
import coreModule from 'app/core/core_module';
@ -17,8 +15,7 @@ var template = `
/** @ngInject */
export function queryPartEditorDirective($compile, templateSrv) {
-var paramTemplate =
-'<input type="text" class="hide input-mini tight-form-func-param"></input>';
+var paramTemplate = '<input type="text" class="hide input-mini tight-form-func-param"></input>';
return {
restrict: 'E',
@ -102,14 +99,12 @@ export function queryPartEditorDirective($compile, templateSrv) {
}
$scope.$apply(function() {
-$scope
-.handleEvent({ $event: { name: 'get-param-options' } })
-.then(function(result) {
-var dynamicOptions = _.map(result, function(op) {
-return op.value;
-});
-callback(dynamicOptions);
-});
+$scope.handleEvent({ $event: { name: 'get-param-options' } }).then(function(result) {
+var dynamicOptions = _.map(result, function(op) {
+return op.value;
+});
+callback(dynamicOptions);
+});
});
};
@ -136,11 +131,9 @@ export function queryPartEditorDirective($compile, templateSrv) {
}
$scope.showActionsMenu = function() {
-$scope
-.handleEvent({ $event: { name: 'get-part-actions' } })
-.then(res => {
-$scope.partActions = res;
-});
+$scope.handleEvent({ $event: { name: 'get-part-actions' } }).then(res => {
+$scope.partActions = res;
+});
};
$scope.triggerPartAction = function(action) {
@ -157,12 +150,8 @@ export function queryPartEditorDirective($compile, templateSrv) {
$('<span>, </span>').appendTo($paramsContainer);
}
-var paramValue = templateSrv.highlightVariablesAsHtml(
-part.params[index]
-);
-var $paramLink = $(
-'<a class="graphite-func-param-link pointer">' + paramValue + '</a>'
-);
+var paramValue = templateSrv.highlightVariablesAsHtml(part.params[index]);
+var $paramLink = $('<a class="graphite-func-param-link pointer">' + paramValue + '</a>');
var $input = $(paramTemplate);
$paramLink.appendTo($paramsContainer);

View File

@ -1,5 +1,6 @@
import PerfectScrollbar from 'perfect-scrollbar';
import coreModule from 'app/core/core_module';
+import appEvents from 'app/core/app_events';
export function geminiScrollbar() {
return {
@ -7,6 +8,19 @@ export function geminiScrollbar() {
link: function(scope, elem, attrs) {
let scrollbar = new PerfectScrollbar(elem[0]);
+appEvents.on(
+'smooth-scroll-top',
+() => {
+elem.animate(
+{
+scrollTop: 0,
+},
+500
+);
+},
+scope
+);
scope.$on('$routeChangeSuccess', () => {
elem[0].scrollTop = 0;
});

View File

@ -0,0 +1,86 @@
import React from "react";
import classNames from "classnames";
import { observer } from "mobx-react";
import { store } from "app/stores/store";
export interface SearchResultProps {
search: any;
}
@observer
export class SearchResult extends React.Component<SearchResultProps, any> {
constructor(props) {
super(props);
this.state = {
search: store.search
};
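// Run the initial search as soon as the component is constructed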
store.search.query();
}
render() {
return this.state.search.sections.map(section => {
return <SearchResultSection section={section} key={section.id} />;
});
}
}
export interface SectionProps {
section: any;
}
@observer
export class SearchResultSection extends React.Component<SectionProps, any> {
constructor(props) {
super(props);
}
renderItem(item) {
return (
<a className="search-item" href={item.url} key={item.id}>
<span className="search-item__icon">
<i className="fa fa-th-large" />
</span>
<span className="search-item__body">
<div className="search-item__body-title">{item.title}</div>
</span>
</a>
);
}
toggleSection = () => {
this.props.section.toggle();
};
render() {
let collapseClassNames = classNames({
fa: true,
"fa-plus": !this.props.section.expanded,
"fa-minus": this.props.section.expanded,
"search-section__header__toggle": true
});
return (
<div className="search-section" key={this.props.section.id}>
<div className="search-section__header">
<i
className={classNames(
"search-section__header__icon",
this.props.section.icon
)}
/>
<span className="search-section__header__text">
{this.props.section.title}
</span>
<i className={collapseClassNames} onClick={this.toggleSection} />
</div>
{this.props.section.expanded && (
<div className="search-section__items">
{this.props.section.items.map(this.renderItem)}
</div>
)}
</div>
);
}
}

Some files were not shown because too many files have changed in this diff