Merge branch 'master' into cli/watch-sass-var

Dominik Prokop 2019-02-12 17:19:45 +01:00
commit 0a66d8afc7
215 changed files with 5579 additions and 3093 deletions

View File

@ -1,13 +1,39 @@
# 6.0.0-beta2 (unreleased)
# 6.0.0-beta3 (unreleased)
# 6.0.0-beta2 (2019-02-11)
### New Features
* **AzureMonitor**: Enable alerting by converting Azure Monitor API to Go [#14623](https://github.com/grafana/grafana/issues/14623)
### Minor
* **Pushover**: Adds support for images in pushover notifier [#10780](https://github.com/grafana/grafana/issues/10780), thx [@jpenalbae](https://github.com/jpenalbae)
* **Alerting**: Adds support for images in pushover notifier [#10780](https://github.com/grafana/grafana/issues/10780), thx [@jpenalbae](https://github.com/jpenalbae)
* **Graphite/InfluxDB/OpenTSDB**: Fix so that the dashboard timezone is always taken into consideration when handling custom time ranges [#15284](https://github.com/grafana/grafana/issues/15284)
* **Stackdriver**: Template variables in filters using globbing format [#15182](https://github.com/grafana/grafana/issues/15182)
* **Cloudwatch**: Add `resource_arns` template variable query function [#8207](https://github.com/grafana/grafana/issues/8207), thx [@jeroenvollenbrock](https://github.com/jeroenvollenbrock)
* **Cloudwatch**: Add AWS/Neptune metrics [#14231](https://github.com/grafana/grafana/issues/14231), thx [@tcpatterson](https://github.com/tcpatterson)
* **Cloudwatch**: Add AWS/EC2/API metrics [#14233](https://github.com/grafana/grafana/issues/14233), thx [@tcpatterson](https://github.com/tcpatterson)
* **Cloudwatch**: Add AWS RDS ServerlessDatabaseCapacity metric [#15265](https://github.com/grafana/grafana/pull/15265), thx [@larsjoergensen](https://github.com/larsjoergensen)
* **MySQL**: Adds datasource SSL CA/client certificates support [#8570](https://github.com/grafana/grafana/issues/8570), thx [@bugficks](https://github.com/bugficks)
* **MSSQL**: Time ranges are now passed for template variable queries [#13324](https://github.com/grafana/grafana/issues/13324), thx [@thatsparesh](https://github.com/thatsparesh)
* **Annotations**: Support PATCH verb in annotations http api [#12546](https://github.com/grafana/grafana/issues/12546), thx [@SamuelToh](https://github.com/SamuelToh)
* **Templating**: Add json formatting to variable interpolation [#15291](https://github.com/grafana/grafana/issues/15291), thx [@mtanda](https://github.com/mtanda)
* **Login**: Anonymous usage stats for token auth [#15288](https://github.com/grafana/grafana/issues/15288)
* **AzureMonitor**: improve autocomplete for Log Analytics and App Insights editor [#15131](https://github.com/grafana/grafana/issues/15131)
* **LDAP**: Fix for IPA/FreeIPA v4.6.4 not allowing LDAP searches with empty attributes [#14432](https://github.com/grafana/grafana/issues/14432)
### Breaking changes
* **Internal Metrics**: Edition has been added to the build_info metric. This will break any Graphite queries using this metric, since edition becomes a new label on the Prometheus metric. [#15363](https://github.com/grafana/grafana/pull/15363)
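
For reference, a minimal sketch of the relabeled metric after this change. The metric name and label set follow the pkg/metrics diff further down in this commit; the namespace value and the concrete label values here are assumptions:

```go
package main

import "github.com/prometheus/client_golang/prometheus"

// grafanaBuildVersion mirrors the updated definition in pkg/metrics: the only
// change is the extra "edition" label, which is what alters the exported
// Graphite series name.
var grafanaBuildVersion = prometheus.NewGaugeVec(prometheus.GaugeOpts{
	Namespace: "grafana", // assumed exporter name
	Name:      "build_info",
	Help:      "A metric with a constant '1' value labeled by version, revision, branch, goversion and edition.",
}, []string{"version", "revision", "branch", "goversion", "edition"})

func main() {
	prometheus.MustRegister(grafanaBuildVersion)
	// Hypothetical values; Grafana fills these in via SetBuildInformation().
	grafanaBuildVersion.WithLabelValues("6.0.0-beta3", "0a66d8afc7", "master", "go1.11", "oss").Set(1)
}
```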
### 6.0.0-beta1 fixes
* **Postgres**: Fix default port not added when port not configured [#15189](https://github.com/grafana/grafana/issues/15189)
* **Alerting**: Fixes crash bug when alert notifier folders are missing [#15295](https://github.com/grafana/grafana/issues/15295)
* **Dashboard**: Fix save provisioned dashboard modal [#15219](https://github.com/grafana/grafana/pull/15219)
* **Dashboard**: Fix issue where a long query in the Prometheus dashboard query editor blocks 30% of the query field on OSX with native scrollbars [#15122](https://github.com/grafana/grafana/issues/15122)
* **Explore**: Fix issue with wrapping on long queries [#15222](https://github.com/grafana/grafana/issues/15222)
* **Explore**: Fix cut & paste adds newline before and after selection [#15223](https://github.com/grafana/grafana/issues/15223)
* **Dataproxy**: Fix global datasource proxy timeout not added to correct http client [#15258](https://github.com/grafana/grafana/issues/15258) [#5699](https://github.com/grafana/grafana/issues/5699)
# 6.0.0-beta1 (2019-01-30)
@ -87,7 +113,7 @@
* **Stackdriver**: Fixes issue with data proxy and Authorization header [#14262](https://github.com/grafana/grafana/issues/14262)
* **Units**: fixedUnit for Flow:l/min and mL/min [#14294](https://github.com/grafana/grafana/issues/14294), thx [@flopp999](https://github.com/flopp999).
* **Logging**: Fix for issue where data proxy logged a secret when debug logging was enabled, now redacted. [#14319](https://github.com/grafana/grafana/issues/14319)
* **InfluxDB**: Add support for alerting on InfluxDB queries that use the cumulative_sum function. [#14314](https://github.com/grafana/grafana/pull/14314), thx [@nitti](https://github.com/nitti)
* TSDB**: Fix always take dashboard timezone into consideration when handle custom time ranges**: Add support for alerting on InfluxDB queries that use the cumulative_sum function. [#14314](https://github.com/grafana/grafana/pull/14314), thx [@nitti](https://github.com/nitti)
* **Plugins**: Panel plugins should now receive the panel-initialized event again as usual.
* **Embedded Graphs**: Iframe graph panels should now work as usual. [#14284](https://github.com/grafana/grafana/issues/14284)
* **Postgres**: Improve PostgreSQL Query Editor if using different Schemas, [#14313](
@ -1022,7 +1048,7 @@ Pull Request: [#8472](https://github.com/grafana/grafana/pull/8472)
* **Docs**: Added some details about Sessions in Postgres [#7694](https://github.com/grafana/grafana/pull/7694) thx [@rickard-von-essen](https://github.com/rickard-von-essen)
* **Influxdb**: Allow commas in template variables [#7681](https://github.com/grafana/grafana/issues/7681) thx [@thuck](https://github.com/thuck)
* **Cloudwatch**: stop using deprecated session.New() [#7736](https://github.com/grafana/grafana/issues/7736) thx [@mtanda](https://github.com/mtanda)
* **OpenTSDB**: Pass dropcounter rate option if no max counter and no reset value or reset value as 0 is specified [#7743](https://github.com/grafana/grafana/pull/7743) thx [@r4um](https://github.com/r4um)
*TSDB**: Fix always take dashboard timezone into consideration when handle custom time ranges**: Pass dropcounter rate option if no max counter and no reset value or reset value as 0 is specified [#7743](https://github.com/grafana/grafana/pull/7743) thx [@r4um](https://github.com/r4um)
* **Templating**: support full resolution for $interval variable [#7696](https://github.com/grafana/grafana/pull/7696) thx [@mtanda](https://github.com/mtanda)
* **Elasticsearch**: Unique Count on string fields in ElasticSearch [#3536](https://github.com/grafana/grafana/issues/3536), thx [@pyro2927](https://github.com/pyro2927)
* **Templating**: Data source template variable that refers to other variable in regex filter [#6365](https://github.com/grafana/grafana/issues/6365) thx [@rlodge](https://github.com/rlodge)

View File

@ -64,6 +64,7 @@ RUN mkdir -p "$GF_PATHS_HOME/.aws" && \
useradd -r -u $GF_UID -g grafana grafana && \
mkdir -p "$GF_PATHS_PROVISIONING/datasources" \
"$GF_PATHS_PROVISIONING/dashboards" \
"$GF_PATHS_PROVISIONING/notifiers" \
"$GF_PATHS_LOGS" \
"$GF_PATHS_PLUGINS" \
"$GF_PATHS_DATA" && \

View File

@ -7,13 +7,18 @@
Grafana is an open source, feature rich metrics dashboard and graph editor for
Graphite, Elasticsearch, OpenTSDB, Prometheus and InfluxDB.
![](https://www.grafanacon.org/2019/images/grafanacon_la_nav-logo.png)
Join us Feb 25-26 in Los Angeles, California for GrafanaCon - a two-day event with talks focused on Grafana and the surrounding open source monitoring ecosystem. Get deep dives into Loki, the Explore workflow and all of the new features of Grafana 6, plus participate in hands on workshops to help you get the most out of your data.
Time is running out - grab your ticket now! http://grafanacon.org
<!---
![](http://docs.grafana.org/assets/img/features/dashboard_ex1.png)
-->
## Installation
Head to [docs.grafana.org](http://docs.grafana.org/installation/) and [download](https://grafana.com/get)
the latest release.
If you have any problems please read the [troubleshooting guide](http://docs.grafana.org/installation/troubleshooting/).
Head to [docs.grafana.org](http://docs.grafana.org/installation/) for documentation or [download](https://grafana.com/get) to get the latest release.
## Documentation & Support
Be sure to read the [getting started guide](http://docs.grafana.org/guides/gettingstarted/) and the other feature guides.

View File

@ -1,6 +1,3 @@
# You need to run 'sysctl -w vm.max_map_count=262144' on the host machine
version: '2'
services:
elasticsearch5:
image: elasticsearch:5
command: elasticsearch

View File

@ -5,7 +5,7 @@
"company": "Grafana Labs"
},
"name": "grafana",
"version": "6.0.0-prebeta2",
"version": "6.0.0-pre3",
"repository": {
"type": "git",
"url": "http://github.com/grafana/grafana.git"
@ -28,6 +28,7 @@
"@types/react-dom": "^16.0.9",
"@types/react-grid-layout": "^0.16.6",
"@types/react-select": "^2.0.4",
"@types/react-transition-group": "^2.0.15",
"@types/react-virtualized": "^9.18.12",
"angular-mocks": "1.6.6",
"autoprefixer": "^6.4.0",
@ -66,10 +67,10 @@
"html-loader": "^0.5.1",
"html-webpack-harddisk-plugin": "^0.2.0",
"html-webpack-plugin": "^3.2.0",
"husky": "^0.14.3",
"husky": "^1.3.1",
"jest": "^23.6.0",
"jest-date-mock": "^1.0.6",
"lint-staged": "^6.0.0",
"lint-staged": "^8.1.3",
"load-grunt-tasks": "3.5.2",
"log-timestamp": "^0.2.1",
"mini-css-extract-plugin": "^0.4.0",
@ -156,6 +157,7 @@
"dependencies": {
"@babel/polyfill": "^7.0.0",
"@torkelo/react-select": "2.1.1",
"@types/reselect": "^2.2.0",
"angular": "1.6.6",
"angular-bindonce": "0.3.1",
"angular-native-dragdrop": "1.2.2",
@ -192,6 +194,7 @@
"redux-logger": "^3.0.6",
"redux-thunk": "^2.3.0",
"remarkable": "^1.7.1",
"reselect": "^4.0.0",
"rst2html": "github:thoward/rst2html#990cb89",
"rxjs": "^6.3.3",
"slate": "^0.33.4",

View File

@ -69,8 +69,8 @@ export class AxisSelector extends React.PureComponent<AxisSelectorProps, AxisSel
}
render() {
const leftButtonClass = this.state.yaxis === 1 ? 'btn-success' : 'btn-inverse';
const rightButtonClass = this.state.yaxis === 2 ? 'btn-success' : 'btn-inverse';
const leftButtonClass = this.state.yaxis === 1 ? 'btn-primary' : 'btn-inverse';
const rightButtonClass = this.state.yaxis === 2 ? 'btn-primary' : 'btn-inverse';
return (
<div className="p-b-1">

View File

@ -29,14 +29,14 @@
&:hover {
.panel-options-group__add-circle {
background-color: $btn-success-bg;
background-color: $btn-primary-bg;
color: $white;
}
}
}
.panel-options-group__add-circle {
@include gradientBar($btn-success-bg, $btn-success-bg-hl);
@include gradientBar($btn-success-bg, $btn-success-bg-hl, #fff);
border-radius: 50px;
width: 20px;

View File

@ -1,4 +1,4 @@
import React from 'react';
import React, { ChangeEvent } from 'react';
import { shallow } from 'enzyme';
import { ThresholdsEditor, Props } from './ThresholdsEditor';
@ -118,7 +118,7 @@ describe('change threshold value', () => {
];
const instance = setup({ thresholds });
const mockEvent = { target: { value: 12 } };
const mockEvent = ({ target: { value: '12' } } as any) as ChangeEvent<HTMLInputElement>;
instance.onChangeThresholdValue(mockEvent, thresholds[0]);
@ -137,7 +137,7 @@ describe('change threshold value', () => {
thresholds,
};
const mockEvent = { target: { value: 78 } };
const mockEvent = ({ target: { value: '78' } } as any) as ChangeEvent<HTMLInputElement>;
instance.onChangeThresholdValue(mockEvent, thresholds[1]);

View File

@ -1,4 +1,4 @@
import React, { PureComponent } from 'react';
import React, { PureComponent, ChangeEvent } from 'react';
import { Threshold } from '../../types';
import { ColorPicker } from '../ColorPicker/ColorPicker';
import { PanelOptionsGroup } from '../PanelOptionsGroup/PanelOptionsGroup';
@ -94,14 +94,15 @@ export class ThresholdsEditor extends PureComponent<Props, State> {
);
};
onChangeThresholdValue = (event: any, threshold: Threshold) => {
onChangeThresholdValue = (event: ChangeEvent<HTMLInputElement>, threshold: Threshold) => {
if (threshold.index === 0) {
return;
}
const { thresholds } = this.state;
const parsedValue = parseInt(event.target.value, 10);
const value = isNaN(parsedValue) ? null : parsedValue;
const cleanValue = event.target.value.replace(/,/g, '.');
const parsedValue = parseFloat(cleanValue);
const value = isNaN(parsedValue) ? '' : parsedValue;
const newThresholds = thresholds.map(t => {
if (t === threshold && t.index !== 0) {
@ -164,16 +165,14 @@ export class ThresholdsEditor extends PureComponent<Props, State> {
<div className="thresholds-row-input-inner-color">
{threshold.color && (
<div className="thresholds-row-input-inner-color-colorpicker">
<ColorPicker
color={threshold.color}
onChange={color => this.onChangeThresholdColor(threshold, color)}
/>
<ColorPicker color={threshold.color} onChange={color => this.onChangeThresholdColor(threshold, color)} />
</div>
)}
</div>
<div className="thresholds-row-input-inner-value">
<input
type="text"
type="number"
step="0.0001"
onChange={event => this.onChangeThresholdValue(event, threshold)}
value={value}
onBlur={this.onBlur}

View File

@ -21,7 +21,7 @@
}
.thresholds-row-add-button {
@include buttonBackground($btn-success-bg, $btn-success-bg-hl);
@include buttonBackground($btn-success-bg, $btn-success-bg-hl, #fff);
align-self: center;
margin-right: 5px;

View File

@ -31,7 +31,7 @@ $popper-margin-from-ref: 5px;
// Themes
&.popper__background--error {
@include popper-theme($tooltipBackgroundError, $tooltipBackgroundError);
@include popper-theme($tooltipBackgroundError, $white);
}
&.popper__background--info {

View File

@ -44,8 +44,8 @@ describe('colors', () => {
});
describe('getColorFromHexRgbOrName', () => {
it('returns undefined for unknown color', () => {
expect(() => getColorFromHexRgbOrName('aruba-sunshine')).toThrow();
it('returns black for unknown color', () => {
expect(getColorFromHexRgbOrName('aruba-sunshine')).toBe('#000000');
});
it('returns dark hex variant for known color if theme not specified', () => {
@ -64,5 +64,9 @@ describe('colors', () => {
expect(getColorFromHexRgbOrName('rgb(0,0,0)')).toBe('rgb(0,0,0)');
expect(getColorFromHexRgbOrName('rgba(0,0,0,1)')).toBe('rgba(0,0,0,1)');
});
it('returns hex for named color that is not a part of named colors palette', () => {
expect(getColorFromHexRgbOrName('lime')).toBe('#00ff00');
});
});
});

View File

@ -1,5 +1,6 @@
import { flatten } from 'lodash';
import { GrafanaThemeType } from '../types';
import tinycolor from 'tinycolor2';
type Hue = 'green' | 'yellow' | 'red' | 'blue' | 'orange' | 'purple';
@ -69,7 +70,9 @@ export const getColorDefinitionByName = (name: Color): ColorDefinition => {
};
export const getColorDefinition = (hex: string, theme: GrafanaThemeType): ColorDefinition | undefined => {
return flatten(Array.from(getNamedColorPalette().values())).filter(definition => definition.variants[theme] === hex)[0];
return flatten(Array.from(getNamedColorPalette().values())).filter(
definition => definition.variants[theme] === hex
)[0];
};
const isHex = (color: string) => {
@ -94,7 +97,9 @@ export const getColorName = (color?: string, theme?: GrafanaThemeType): Color |
};
export const getColorByName = (colorName: string) => {
const definition = flatten(Array.from(getNamedColorPalette().values())).filter(definition => definition.name === colorName);
const definition = flatten(Array.from(getNamedColorPalette().values())).filter(
definition => definition.name === colorName
);
return definition.length > 0 ? definition[0] : undefined;
};
@ -106,7 +111,7 @@ export const getColorFromHexRgbOrName = (color: string, theme?: GrafanaThemeType
const colorDefinition = getColorByName(color);
if (!colorDefinition) {
throw new Error('Unknown color');
return new tinycolor(color).toHexString();
}
return theme ? colorDefinition.variants[theme] : colorDefinition.variants.dark;

View File

@ -31,11 +31,16 @@ case "$1" in
cp /usr/share/grafana/conf/ldap.toml /etc/grafana/ldap.toml
fi
if [ ! -f $PROVISIONING_CFG_DIR ]; then
if [ ! -d $PROVISIONING_CFG_DIR ]; then
mkdir -p $PROVISIONING_CFG_DIR/dashboards $PROVISIONING_CFG_DIR/datasources
cp /usr/share/grafana/conf/provisioning/dashboards/sample.yaml $PROVISIONING_CFG_DIR/dashboards/sample.yaml
cp /usr/share/grafana/conf/provisioning/datasources/sample.yaml $PROVISIONING_CFG_DIR/datasources/sample.yaml
fi
fi
if [ ! -d $PROVISIONING_CFG_DIR/notifiers ]; then
mkdir -p $PROVISIONING_CFG_DIR/notifiers
cp /usr/share/grafana/conf/provisioning/notifiers/sample.yaml $PROVISIONING_CFG_DIR/notifiers/sample.yaml
fi
# configuration files should not be modifiable by grafana user, as this can be a security issue
chown -Rh root:$GRAFANA_GROUP /etc/grafana/*

View File

@ -39,6 +39,7 @@ RUN mkdir -p "$GF_PATHS_HOME/.aws" && \
useradd -r -u $GF_UID -g grafana grafana && \
mkdir -p "$GF_PATHS_PROVISIONING/datasources" \
"$GF_PATHS_PROVISIONING/dashboards" \
"$GF_PATHS_PROVISIONING/notifiers" \
"$GF_PATHS_LOGS" \
"$GF_PATHS_PLUGINS" \
"$GF_PATHS_DATA" && \

View File

@ -45,11 +45,16 @@ if [ $1 -eq 1 ] ; then
cp /usr/share/grafana/conf/ldap.toml /etc/grafana/ldap.toml
fi
if [ ! -f $PROVISIONING_CFG_DIR ]; then
if [ ! -d $PROVISIONING_CFG_DIR ]; then
mkdir -p $PROVISIONING_CFG_DIR/dashboards $PROVISIONING_CFG_DIR/datasources
cp /usr/share/grafana/conf/provisioning/dashboards/sample.yaml $PROVISIONING_CFG_DIR/dashboards/sample.yaml
cp /usr/share/grafana/conf/provisioning/datasources/sample.yaml $PROVISIONING_CFG_DIR/datasources/sample.yaml
fi
fi
if [ ! -d $PROVISIONING_CFG_DIR/notifiers ]; then
mkdir -p $PROVISIONING_CFG_DIR/notifiers
cp /usr/share/grafana/conf/provisioning/notifiers/sample.yaml $PROVISIONING_CFG_DIR/notifiers/sample.yaml
fi
# Set user permissions on /var/log/grafana, /var/lib/grafana
mkdir -p /var/log/grafana /var/lib/grafana

View File

@ -16,7 +16,7 @@ func (hs *HTTPServer) registerRoutes() {
reqOrgAdmin := middleware.ReqOrgAdmin
redirectFromLegacyDashboardURL := middleware.RedirectFromLegacyDashboardURL()
redirectFromLegacyDashboardSoloURL := middleware.RedirectFromLegacyDashboardSoloURL()
quota := middleware.Quota
quota := middleware.Quota(hs.QuotaService)
bind := binding.Bind
r := hs.RouteRegister
@ -286,7 +286,7 @@ func (hs *HTTPServer) registerRoutes() {
dashboardRoute.Post("/calculate-diff", bind(dtos.CalculateDiffOptions{}), Wrap(CalculateDashboardDiff))
dashboardRoute.Post("/db", bind(m.SaveDashboardCommand{}), Wrap(PostDashboard))
dashboardRoute.Post("/db", bind(m.SaveDashboardCommand{}), Wrap(hs.PostDashboard))
dashboardRoute.Get("/home", Wrap(GetHomeDashboard))
dashboardRoute.Get("/tags", GetDashboardTags)
dashboardRoute.Post("/import", bind(dtos.ImportDashboardCommand{}), Wrap(ImportDashboard))
@ -294,7 +294,7 @@ func (hs *HTTPServer) registerRoutes() {
dashboardRoute.Group("/id/:dashboardId", func(dashIdRoute routing.RouteRegister) {
dashIdRoute.Get("/versions", Wrap(GetDashboardVersions))
dashIdRoute.Get("/versions/:id", Wrap(GetDashboardVersion))
dashIdRoute.Post("/restore", bind(dtos.RestoreDashboardVersionCommand{}), Wrap(RestoreDashboardVersion))
dashIdRoute.Post("/restore", bind(dtos.RestoreDashboardVersionCommand{}), Wrap(hs.RestoreDashboardVersion))
dashIdRoute.Group("/permissions", func(dashboardPermissionRoute routing.RouteRegister) {
dashboardPermissionRoute.Get("/", Wrap(GetDashboardPermissionList))

View File

@ -18,7 +18,6 @@ import (
m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/guardian"
"github.com/grafana/grafana/pkg/services/quota"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util"
)
@ -208,14 +207,14 @@ func DeleteDashboardByUID(c *m.ReqContext) Response {
})
}
func PostDashboard(c *m.ReqContext, cmd m.SaveDashboardCommand) Response {
func (hs *HTTPServer) PostDashboard(c *m.ReqContext, cmd m.SaveDashboardCommand) Response {
cmd.OrgId = c.OrgId
cmd.UserId = c.UserId
dash := cmd.GetDashboardModel()
if dash.Id == 0 && dash.Uid == "" {
limitReached, err := quota.QuotaReached(c, "dashboard")
limitReached, err := hs.QuotaService.QuotaReached(c, "dashboard")
if err != nil {
return Error(500, "failed to get quota", err)
}
@ -463,7 +462,7 @@ func CalculateDashboardDiff(c *m.ReqContext, apiOptions dtos.CalculateDiffOption
}
// RestoreDashboardVersion restores a dashboard to the given version.
func RestoreDashboardVersion(c *m.ReqContext, apiCmd dtos.RestoreDashboardVersionCommand) Response {
func (hs *HTTPServer) RestoreDashboardVersion(c *m.ReqContext, apiCmd dtos.RestoreDashboardVersionCommand) Response {
dash, rsp := getDashboardHelper(c.OrgId, "", c.ParamsInt64(":dashboardId"), "")
if rsp != nil {
return rsp
@ -490,7 +489,7 @@ func RestoreDashboardVersion(c *m.ReqContext, apiCmd dtos.RestoreDashboardVersio
saveCmd.Dashboard.Set("uid", dash.Uid)
saveCmd.Message = fmt.Sprintf("Restored from version %d", version.Version)
return PostDashboard(c, saveCmd)
return hs.PostDashboard(c, saveCmd)
}
func GetDashboardTags(c *m.ReqContext) {

View File

@ -881,12 +881,16 @@ func postDashboardScenario(desc string, url string, routePattern string, mock *d
Convey(desc+" "+url, func() {
defer bus.ClearBusHandlers()
hs := HTTPServer{
Bus: bus.GetBus(),
}
sc := setupScenarioContext(url)
sc.defaultHandler = Wrap(func(c *m.ReqContext) Response {
sc.context = c
sc.context.SignedInUser = &m.SignedInUser{OrgId: cmd.OrgId, UserId: cmd.UserId}
return PostDashboard(c, cmd)
return hs.PostDashboard(c, cmd)
})
origNewDashboardService := dashboards.NewService

View File

@ -24,6 +24,7 @@ import (
"github.com/grafana/grafana/pkg/services/cache"
"github.com/grafana/grafana/pkg/services/datasources"
"github.com/grafana/grafana/pkg/services/hooks"
"github.com/grafana/grafana/pkg/services/quota"
"github.com/grafana/grafana/pkg/services/rendering"
"github.com/grafana/grafana/pkg/services/session"
"github.com/grafana/grafana/pkg/setting"
@ -55,6 +56,7 @@ type HTTPServer struct {
CacheService *cache.CacheService `inject:""`
DatasourceCache datasources.CacheService `inject:""`
AuthTokenService models.UserTokenService `inject:""`
QuotaService *quota.QuotaService `inject:""`
}
func (hs *HTTPServer) Init() error {

View File

@ -54,7 +54,7 @@ func NewDataSourceProxy(ds *m.DataSource, plugin *plugins.DataSourcePlugin, ctx
func newHTTPClient() httpClient {
return &http.Client{
Timeout: time.Duration(setting.DataProxyTimeout) * time.Second,
Timeout: 30 * time.Second,
Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
}
}

View File

@ -19,6 +19,7 @@ import (
_ "github.com/grafana/grafana/pkg/services/alerting/conditions"
_ "github.com/grafana/grafana/pkg/services/alerting/notifiers"
"github.com/grafana/grafana/pkg/setting"
_ "github.com/grafana/grafana/pkg/tsdb/azuremonitor"
_ "github.com/grafana/grafana/pkg/tsdb/cloudwatch"
_ "github.com/grafana/grafana/pkg/tsdb/elasticsearch"
_ "github.com/grafana/grafana/pkg/tsdb/graphite"

View File

@ -0,0 +1,54 @@
package usagestats
import (
"context"
"time"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/services/sqlstore"
"github.com/grafana/grafana/pkg/social"
"github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/registry"
"github.com/grafana/grafana/pkg/setting"
)
var metricsLogger log.Logger = log.New("metrics")
func init() {
registry.RegisterService(&UsageStatsService{})
}
type UsageStatsService struct {
Cfg *setting.Cfg `inject:""`
Bus bus.Bus `inject:""`
SQLStore *sqlstore.SqlStore `inject:""`
oauthProviders map[string]bool
}
func (uss *UsageStatsService) Init() error {
uss.oauthProviders = social.GetOAuthProviders(uss.Cfg)
return nil
}
func (uss *UsageStatsService) Run(ctx context.Context) error {
uss.updateTotalStats()
onceEveryDayTick := time.NewTicker(time.Hour * 24)
everyMinuteTicker := time.NewTicker(time.Minute)
defer onceEveryDayTick.Stop()
defer everyMinuteTicker.Stop()
for {
select {
case <-onceEveryDayTick.C:
uss.sendUsageStats(uss.oauthProviders)
case <-everyMinuteTicker.C:
uss.updateTotalStats()
case <-ctx.Done():
return ctx.Err()
}
}
}

View File

@ -0,0 +1,177 @@
package usagestats
import (
"bytes"
"encoding/json"
"fmt"
"net/http"
"runtime"
"strings"
"time"
"github.com/grafana/grafana/pkg/metrics"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/setting"
)
var usageStatsURL = "https://stats.grafana.org/grafana-usage-report"
func (uss *UsageStatsService) sendUsageStats(oauthProviders map[string]bool) {
if !setting.ReportingEnabled {
return
}
metricsLogger.Debug(fmt.Sprintf("Sending anonymous usage stats to %s", usageStatsURL))
version := strings.Replace(setting.BuildVersion, ".", "_", -1)
metrics := map[string]interface{}{}
report := map[string]interface{}{
"version": version,
"metrics": metrics,
"os": runtime.GOOS,
"arch": runtime.GOARCH,
"edition": getEdition(),
"packaging": setting.Packaging,
}
statsQuery := models.GetSystemStatsQuery{}
if err := uss.Bus.Dispatch(&statsQuery); err != nil {
metricsLogger.Error("Failed to get system stats", "error", err)
return
}
metrics["stats.dashboards.count"] = statsQuery.Result.Dashboards
metrics["stats.users.count"] = statsQuery.Result.Users
metrics["stats.orgs.count"] = statsQuery.Result.Orgs
metrics["stats.playlist.count"] = statsQuery.Result.Playlists
metrics["stats.plugins.apps.count"] = len(plugins.Apps)
metrics["stats.plugins.panels.count"] = len(plugins.Panels)
metrics["stats.plugins.datasources.count"] = len(plugins.DataSources)
metrics["stats.alerts.count"] = statsQuery.Result.Alerts
metrics["stats.active_users.count"] = statsQuery.Result.ActiveUsers
metrics["stats.datasources.count"] = statsQuery.Result.Datasources
metrics["stats.stars.count"] = statsQuery.Result.Stars
metrics["stats.folders.count"] = statsQuery.Result.Folders
metrics["stats.dashboard_permissions.count"] = statsQuery.Result.DashboardPermissions
metrics["stats.folder_permissions.count"] = statsQuery.Result.FolderPermissions
metrics["stats.provisioned_dashboards.count"] = statsQuery.Result.ProvisionedDashboards
metrics["stats.snapshots.count"] = statsQuery.Result.Snapshots
metrics["stats.teams.count"] = statsQuery.Result.Teams
metrics["stats.total_auth_token.count"] = statsQuery.Result.AuthTokens
userCount := statsQuery.Result.Users
avgAuthTokensPerUser := statsQuery.Result.AuthTokens
if userCount != 0 {
avgAuthTokensPerUser = avgAuthTokensPerUser / userCount
}
metrics["stats.avg_auth_token_per_user.count"] = avgAuthTokensPerUser
dsStats := models.GetDataSourceStatsQuery{}
if err := uss.Bus.Dispatch(&dsStats); err != nil {
metricsLogger.Error("Failed to get datasource stats", "error", err)
return
}
// send counters for each data source
// but ignore any custom data sources
// as sending that name could be sensitive information
dsOtherCount := 0
for _, dsStat := range dsStats.Result {
if models.IsKnownDataSourcePlugin(dsStat.Type) {
metrics["stats.ds."+dsStat.Type+".count"] = dsStat.Count
} else {
dsOtherCount += dsStat.Count
}
}
metrics["stats.ds.other.count"] = dsOtherCount
metrics["stats.packaging."+setting.Packaging+".count"] = 1
dsAccessStats := models.GetDataSourceAccessStatsQuery{}
if err := uss.Bus.Dispatch(&dsAccessStats); err != nil {
metricsLogger.Error("Failed to get datasource access stats", "error", err)
return
}
// send access counters for each data source
// but ignore any custom data sources
// as sending that name could be sensitive information
dsAccessOtherCount := make(map[string]int64)
for _, dsAccessStat := range dsAccessStats.Result {
if dsAccessStat.Access == "" {
continue
}
access := strings.ToLower(dsAccessStat.Access)
if models.IsKnownDataSourcePlugin(dsAccessStat.Type) {
metrics["stats.ds_access."+dsAccessStat.Type+"."+access+".count"] = dsAccessStat.Count
} else {
old := dsAccessOtherCount[access]
dsAccessOtherCount[access] = old + dsAccessStat.Count
}
}
for access, count := range dsAccessOtherCount {
metrics["stats.ds_access.other."+access+".count"] = count
}
anStats := models.GetAlertNotifierUsageStatsQuery{}
if err := uss.Bus.Dispatch(&anStats); err != nil {
metricsLogger.Error("Failed to get alert notification stats", "error", err)
return
}
for _, stats := range anStats.Result {
metrics["stats.alert_notifiers."+stats.Type+".count"] = stats.Count
}
authTypes := map[string]bool{}
authTypes["anonymous"] = setting.AnonymousEnabled
authTypes["basic_auth"] = setting.BasicAuthEnabled
authTypes["ldap"] = setting.LdapEnabled
authTypes["auth_proxy"] = setting.AuthProxyEnabled
for provider, enabled := range oauthProviders {
authTypes["oauth_"+provider] = enabled
}
for authType, enabled := range authTypes {
enabledValue := 0
if enabled {
enabledValue = 1
}
metrics["stats.auth_enabled."+authType+".count"] = enabledValue
}
out, _ := json.MarshalIndent(report, "", " ")
data := bytes.NewBuffer(out)
client := http.Client{Timeout: 5 * time.Second}
go client.Post(usageStatsURL, "application/json", data)
}
func (uss *UsageStatsService) updateTotalStats() {
statsQuery := models.GetSystemStatsQuery{}
if err := uss.Bus.Dispatch(&statsQuery); err != nil {
metricsLogger.Error("Failed to get system stats", "error", err)
return
}
metrics.M_StatTotal_Dashboards.Set(float64(statsQuery.Result.Dashboards))
metrics.M_StatTotal_Users.Set(float64(statsQuery.Result.Users))
metrics.M_StatActive_Users.Set(float64(statsQuery.Result.ActiveUsers))
metrics.M_StatTotal_Playlists.Set(float64(statsQuery.Result.Playlists))
metrics.M_StatTotal_Orgs.Set(float64(statsQuery.Result.Orgs))
}
func getEdition() string {
if setting.IsEnterprise {
return "enterprise"
} else {
return "oss"
}
}
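
A self-contained sketch of the kind of payload `sendUsageStats` above assembles and POSTs; the field names follow the code, while the values and the trimmed-down metrics map are illustrative:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Mirrors the report map built in sendUsageStats; only a few of the
	// stats.* counters are shown here.
	report := map[string]interface{}{
		"version":   "6_0_0-beta3", // BuildVersion with dots replaced by underscores
		"os":        "linux",
		"arch":      "amd64",
		"edition":   "oss",
		"packaging": "deb",
		"metrics": map[string]interface{}{
			"stats.dashboards.count":              1,
			"stats.total_auth_token.count":        15,
			"stats.avg_auth_token_per_user.count": 5,
		},
	}
	out, _ := json.MarshalIndent(report, "", " ")
	fmt.Println(string(out)) // this body is what gets POSTed as application/json
}
```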

View File

@ -1,4 +1,4 @@
package metrics
package usagestats
import (
"bytes"
@ -15,14 +15,21 @@ import (
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/sqlstore"
"github.com/grafana/grafana/pkg/setting"
. "github.com/smartystreets/goconvey/convey"
)
func TestMetrics(t *testing.T) {
Convey("Test send usage stats", t, func() {
uss := &UsageStatsService{
Bus: bus.New(),
SQLStore: sqlstore.InitTestDB(t),
}
var getSystemStatsQuery *models.GetSystemStatsQuery
bus.AddHandler("test", func(query *models.GetSystemStatsQuery) error {
uss.Bus.AddHandler(func(query *models.GetSystemStatsQuery) error {
query.Result = &models.SystemStats{
Dashboards: 1,
Datasources: 2,
@ -38,13 +45,14 @@ func TestMetrics(t *testing.T) {
ProvisionedDashboards: 12,
Snapshots: 13,
Teams: 14,
AuthTokens: 15,
}
getSystemStatsQuery = query
return nil
})
var getDataSourceStatsQuery *models.GetDataSourceStatsQuery
bus.AddHandler("test", func(query *models.GetDataSourceStatsQuery) error {
uss.Bus.AddHandler(func(query *models.GetDataSourceStatsQuery) error {
query.Result = []*models.DataSourceStats{
{
Type: models.DS_ES,
@ -68,7 +76,7 @@ func TestMetrics(t *testing.T) {
})
var getDataSourceAccessStatsQuery *models.GetDataSourceAccessStatsQuery
bus.AddHandler("test", func(query *models.GetDataSourceAccessStatsQuery) error {
uss.Bus.AddHandler(func(query *models.GetDataSourceAccessStatsQuery) error {
query.Result = []*models.DataSourceAccessStats{
{
Type: models.DS_ES,
@ -116,7 +124,7 @@ func TestMetrics(t *testing.T) {
})
var getAlertNotifierUsageStatsQuery *models.GetAlertNotifierUsageStatsQuery
bus.AddHandler("test", func(query *models.GetAlertNotifierUsageStatsQuery) error {
uss.Bus.AddHandler(func(query *models.GetAlertNotifierUsageStatsQuery) error {
query.Result = []*models.NotifierUsageStats{
{
Type: "slack",
@ -155,11 +163,11 @@ func TestMetrics(t *testing.T) {
"grafana_com": true,
}
sendUsageStats(oauthProviders)
uss.sendUsageStats(oauthProviders)
Convey("Given reporting not enabled and sending usage stats", func() {
setting.ReportingEnabled = false
sendUsageStats(oauthProviders)
uss.sendUsageStats(oauthProviders)
Convey("Should not gather stats or call http endpoint", func() {
So(getSystemStatsQuery, ShouldBeNil)
@ -179,7 +187,7 @@ func TestMetrics(t *testing.T) {
setting.Packaging = "deb"
wg.Add(1)
sendUsageStats(oauthProviders)
uss.sendUsageStats(oauthProviders)
Convey("Should gather stats and call http endpoint", func() {
if waitTimeout(&wg, 2*time.Second) {
@ -221,6 +229,8 @@ func TestMetrics(t *testing.T) {
So(metrics.Get("stats.provisioned_dashboards.count").MustInt(), ShouldEqual, getSystemStatsQuery.Result.ProvisionedDashboards)
So(metrics.Get("stats.snapshots.count").MustInt(), ShouldEqual, getSystemStatsQuery.Result.Snapshots)
So(metrics.Get("stats.teams.count").MustInt(), ShouldEqual, getSystemStatsQuery.Result.Teams)
So(metrics.Get("stats.total_auth_token.count").MustInt64(), ShouldEqual, 15)
So(metrics.Get("stats.avg_auth_token_per_user.count").MustInt64(), ShouldEqual, 5)
So(metrics.Get("stats.ds."+models.DS_ES+".count").MustInt(), ShouldEqual, 9)
So(metrics.Get("stats.ds."+models.DS_PROMETHEUS+".count").MustInt(), ShouldEqual, 10)
@ -246,6 +256,7 @@ func TestMetrics(t *testing.T) {
So(metrics.Get("stats.auth_enabled.oauth_grafana_com.count").MustInt(), ShouldEqual, 1)
So(metrics.Get("stats.packaging.deb.count").MustInt(), ShouldEqual, 1)
})
})

View File

@ -4,18 +4,30 @@ import (
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/log"
m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/registry"
"github.com/grafana/grafana/pkg/services/quota"
)
func init() {
bus.AddHandler("auth", UpsertUser)
registry.RegisterService(&LoginService{})
}
var (
logger = log.New("login.ext_user")
)
func UpsertUser(cmd *m.UpsertUserCommand) error {
type LoginService struct {
Bus bus.Bus `inject:""`
QuotaService *quota.QuotaService `inject:""`
}
func (ls *LoginService) Init() error {
ls.Bus.AddHandler(ls.UpsertUser)
return nil
}
func (ls *LoginService) UpsertUser(cmd *m.UpsertUserCommand) error {
extUser := cmd.ExternalUser
userQuery := &m.GetUserByAuthInfoQuery{
@ -37,7 +49,7 @@ func UpsertUser(cmd *m.UpsertUserCommand) error {
return ErrInvalidCredentials
}
limitReached, err := quota.QuotaReached(cmd.ReqContext, "user")
limitReached, err := ls.QuotaService.QuotaReached(cmd.ReqContext, "user")
if err != nil {
log.Warn("Error getting user quota. error: %v", err)
return ErrGettingUserQuota
@ -57,7 +69,7 @@ func UpsertUser(cmd *m.UpsertUserCommand) error {
AuthModule: extUser.AuthModule,
AuthId: extUser.AuthId,
}
if err := bus.Dispatch(cmd2); err != nil {
if err := ls.Bus.Dispatch(cmd2); err != nil {
return err
}
}
@ -78,12 +90,12 @@ func UpsertUser(cmd *m.UpsertUserCommand) error {
// Sync isGrafanaAdmin permission
if extUser.IsGrafanaAdmin != nil && *extUser.IsGrafanaAdmin != cmd.Result.IsAdmin {
if err := bus.Dispatch(&m.UpdateUserPermissionsCommand{UserId: cmd.Result.Id, IsGrafanaAdmin: *extUser.IsGrafanaAdmin}); err != nil {
if err := ls.Bus.Dispatch(&m.UpdateUserPermissionsCommand{UserId: cmd.Result.Id, IsGrafanaAdmin: *extUser.IsGrafanaAdmin}); err != nil {
return err
}
}
err = bus.Dispatch(&m.SyncTeamsCommand{
err = ls.Bus.Dispatch(&m.SyncTeamsCommand{
User: cmd.Result,
ExternalUser: extUser,
})

View File

@ -395,8 +395,11 @@ func ldapAutherScenario(desc string, fn scenarioFunc) {
defer bus.ClearBusHandlers()
sc := &scenarioContext{}
loginService := &LoginService{
Bus: bus.GetBus(),
}
bus.AddHandler("test", UpsertUser)
bus.AddHandler("test", loginService.UpsertUser)
bus.AddHandlerCtx("test", func(ctx context.Context, cmd *m.SyncTeamsCommand) error {
return nil

View File

@ -1,17 +1,10 @@
package metrics
import (
"bytes"
"encoding/json"
"net/http"
"runtime"
"strings"
"time"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/setting"
"github.com/prometheus/client_golang/prometheus"
)
@ -68,23 +61,6 @@ var (
grafanaBuildVersion *prometheus.GaugeVec
)
func newCounterVecStartingAtZero(opts prometheus.CounterOpts, labels []string, labelValues ...string) *prometheus.CounterVec {
counter := prometheus.NewCounterVec(opts, labels)
for _, label := range labelValues {
counter.WithLabelValues(label).Add(0)
}
return counter
}
func newCounterStartingAtZero(opts prometheus.CounterOpts, labelValues ...string) prometheus.Counter {
counter := prometheus.NewCounter(opts)
counter.Add(0)
return counter
}
func init() {
M_Instance_Start = prometheus.NewCounter(prometheus.CounterOpts{
Name: "instance_start_total",
@ -308,7 +284,7 @@ func init() {
Name: "build_info",
Help: "A metric with a constant '1' value labeled by version, revision, branch, and goversion from which Grafana was built.",
Namespace: exporterName,
}, []string{"version", "revision", "branch", "goversion"})
}, []string{"version", "revision", "branch", "goversion", "edition"})
}
// SetBuildInformation sets the build information for this binary
@ -317,8 +293,13 @@ func SetBuildInformation(version, revision, branch string) {
// Once this have been released for some time we should be able to remote `M_Grafana_Version`
// The reason we added a new one is that its common practice in the prometheus community
// to name this metric `*_build_info` so its easy to do aggregation on all programs.
edition := "oss"
if setting.IsEnterprise {
edition = "enterprise"
}
M_Grafana_Version.WithLabelValues(version).Set(1)
grafanaBuildVersion.WithLabelValues(version, revision, branch, runtime.Version()).Set(1)
grafanaBuildVersion.WithLabelValues(version, revision, branch, runtime.Version(), edition).Set(1)
}
func initMetricVars() {
@ -362,154 +343,19 @@ func initMetricVars() {
}
func updateTotalStats() {
statsQuery := models.GetSystemStatsQuery{}
if err := bus.Dispatch(&statsQuery); err != nil {
metricsLogger.Error("Failed to get system stats", "error", err)
return
func newCounterVecStartingAtZero(opts prometheus.CounterOpts, labels []string, labelValues ...string) *prometheus.CounterVec {
counter := prometheus.NewCounterVec(opts, labels)
for _, label := range labelValues {
counter.WithLabelValues(label).Add(0)
}
M_StatTotal_Dashboards.Set(float64(statsQuery.Result.Dashboards))
M_StatTotal_Users.Set(float64(statsQuery.Result.Users))
M_StatActive_Users.Set(float64(statsQuery.Result.ActiveUsers))
M_StatTotal_Playlists.Set(float64(statsQuery.Result.Playlists))
M_StatTotal_Orgs.Set(float64(statsQuery.Result.Orgs))
return counter
}
var usageStatsURL = "https://stats.grafana.org/grafana-usage-report"
func newCounterStartingAtZero(opts prometheus.CounterOpts, labelValues ...string) prometheus.Counter {
counter := prometheus.NewCounter(opts)
counter.Add(0)
func getEdition() string {
if setting.IsEnterprise {
return "enterprise"
} else {
return "oss"
}
}
func sendUsageStats(oauthProviders map[string]bool) {
if !setting.ReportingEnabled {
return
}
metricsLogger.Debug("Sending anonymous usage stats to stats.grafana.org")
version := strings.Replace(setting.BuildVersion, ".", "_", -1)
metrics := map[string]interface{}{}
report := map[string]interface{}{
"version": version,
"metrics": metrics,
"os": runtime.GOOS,
"arch": runtime.GOARCH,
"edition": getEdition(),
"packaging": setting.Packaging,
}
statsQuery := models.GetSystemStatsQuery{}
if err := bus.Dispatch(&statsQuery); err != nil {
metricsLogger.Error("Failed to get system stats", "error", err)
return
}
metrics["stats.dashboards.count"] = statsQuery.Result.Dashboards
metrics["stats.users.count"] = statsQuery.Result.Users
metrics["stats.orgs.count"] = statsQuery.Result.Orgs
metrics["stats.playlist.count"] = statsQuery.Result.Playlists
metrics["stats.plugins.apps.count"] = len(plugins.Apps)
metrics["stats.plugins.panels.count"] = len(plugins.Panels)
metrics["stats.plugins.datasources.count"] = len(plugins.DataSources)
metrics["stats.alerts.count"] = statsQuery.Result.Alerts
metrics["stats.active_users.count"] = statsQuery.Result.ActiveUsers
metrics["stats.datasources.count"] = statsQuery.Result.Datasources
metrics["stats.stars.count"] = statsQuery.Result.Stars
metrics["stats.folders.count"] = statsQuery.Result.Folders
metrics["stats.dashboard_permissions.count"] = statsQuery.Result.DashboardPermissions
metrics["stats.folder_permissions.count"] = statsQuery.Result.FolderPermissions
metrics["stats.provisioned_dashboards.count"] = statsQuery.Result.ProvisionedDashboards
metrics["stats.snapshots.count"] = statsQuery.Result.Snapshots
metrics["stats.teams.count"] = statsQuery.Result.Teams
dsStats := models.GetDataSourceStatsQuery{}
if err := bus.Dispatch(&dsStats); err != nil {
metricsLogger.Error("Failed to get datasource stats", "error", err)
return
}
// send counters for each data source
// but ignore any custom data sources
// as sending that name could be sensitive information
dsOtherCount := 0
for _, dsStat := range dsStats.Result {
if models.IsKnownDataSourcePlugin(dsStat.Type) {
metrics["stats.ds."+dsStat.Type+".count"] = dsStat.Count
} else {
dsOtherCount += dsStat.Count
}
}
metrics["stats.ds.other.count"] = dsOtherCount
metrics["stats.packaging."+setting.Packaging+".count"] = 1
dsAccessStats := models.GetDataSourceAccessStatsQuery{}
if err := bus.Dispatch(&dsAccessStats); err != nil {
metricsLogger.Error("Failed to get datasource access stats", "error", err)
return
}
// send access counters for each data source
// but ignore any custom data sources
// as sending that name could be sensitive information
dsAccessOtherCount := make(map[string]int64)
for _, dsAccessStat := range dsAccessStats.Result {
if dsAccessStat.Access == "" {
continue
}
access := strings.ToLower(dsAccessStat.Access)
if models.IsKnownDataSourcePlugin(dsAccessStat.Type) {
metrics["stats.ds_access."+dsAccessStat.Type+"."+access+".count"] = dsAccessStat.Count
} else {
old := dsAccessOtherCount[access]
dsAccessOtherCount[access] = old + dsAccessStat.Count
}
}
for access, count := range dsAccessOtherCount {
metrics["stats.ds_access.other."+access+".count"] = count
}
anStats := models.GetAlertNotifierUsageStatsQuery{}
if err := bus.Dispatch(&anStats); err != nil {
metricsLogger.Error("Failed to get alert notification stats", "error", err)
return
}
for _, stats := range anStats.Result {
metrics["stats.alert_notifiers."+stats.Type+".count"] = stats.Count
}
authTypes := map[string]bool{}
authTypes["anonymous"] = setting.AnonymousEnabled
authTypes["basic_auth"] = setting.BasicAuthEnabled
authTypes["ldap"] = setting.LdapEnabled
authTypes["auth_proxy"] = setting.AuthProxyEnabled
for provider, enabled := range oauthProviders {
authTypes["oauth_"+provider] = enabled
}
for authType, enabled := range authTypes {
enabledValue := 0
if enabled {
enabledValue = 1
}
metrics["stats.auth_enabled."+authType+".count"] = enabledValue
}
out, _ := json.MarshalIndent(report, "", " ")
data := bytes.NewBuffer(out)
client := http.Client{Timeout: 5 * time.Second}
go client.Post(usageStatsURL, "application/json", data)
return counter
}

View File

@ -2,7 +2,6 @@ package metrics
import (
"context"
"time"
"github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/metrics/graphitebridge"
@ -30,7 +29,6 @@ type InternalMetricsService struct {
intervalSeconds int64
graphiteCfg *graphitebridge.Config
oauthProviders map[string]bool
}
func (im *InternalMetricsService) Init() error {
@ -50,22 +48,6 @@ func (im *InternalMetricsService) Run(ctx context.Context) error {
M_Instance_Start.Inc()
// set the total stats gauges before we publishing metrics
updateTotalStats()
onceEveryDayTick := time.NewTicker(time.Hour * 24)
everyMinuteTicker := time.NewTicker(time.Minute)
defer onceEveryDayTick.Stop()
defer everyMinuteTicker.Stop()
for {
select {
case <-onceEveryDayTick.C:
sendUsageStats(im.oauthProviders)
case <-everyMinuteTicker.C:
updateTotalStats()
case <-ctx.Done():
return ctx.Err()
}
}
<-ctx.Done()
return ctx.Err()
}

View File

@ -5,8 +5,6 @@ import (
"strings"
"time"
"github.com/grafana/grafana/pkg/social"
"github.com/grafana/grafana/pkg/metrics/graphitebridge"
"github.com/grafana/grafana/pkg/setting"
"github.com/prometheus/client_golang/prometheus"
@ -24,8 +22,6 @@ func (im *InternalMetricsService) readSettings() error {
return fmt.Errorf("Unable to parse metrics graphite section, %v", err)
}
im.oauthProviders = social.GetOAuthProviders(im.Cfg)
return nil
}

View File

@ -682,6 +682,7 @@ type fakeUserAuthTokenService struct {
tryRotateTokenProvider func(token *m.UserToken, clientIP, userAgent string) (bool, error)
lookupTokenProvider func(unhashedToken string) (*m.UserToken, error)
revokeTokenProvider func(token *m.UserToken) error
activeAuthTokenCount func() (int64, error)
}
func newFakeUserAuthTokenService() *fakeUserAuthTokenService {
@ -704,6 +705,9 @@ func newFakeUserAuthTokenService() *fakeUserAuthTokenService {
revokeTokenProvider: func(token *m.UserToken) error {
return nil
},
activeAuthTokenCount: func() (int64, error) {
return 10, nil
},
}
}
@ -722,3 +726,7 @@ func (s *fakeUserAuthTokenService) TryRotateToken(token *m.UserToken, clientIP,
func (s *fakeUserAuthTokenService) RevokeToken(token *m.UserToken) error {
return s.revokeTokenProvider(token)
}
func (s *fakeUserAuthTokenService) ActiveTokenCount() (int64, error) {
return s.activeAuthTokenCount()
}

View File

@ -9,16 +9,20 @@ import (
"github.com/grafana/grafana/pkg/services/quota"
)
func Quota(target string) macaron.Handler {
return func(c *m.ReqContext) {
limitReached, err := quota.QuotaReached(c, target)
if err != nil {
c.JsonApiErr(500, "failed to get quota", err)
return
}
if limitReached {
c.JsonApiErr(403, fmt.Sprintf("%s Quota reached", target), nil)
return
// Quota returns a function that returns a function used to call quotaservice based on target name
func Quota(quotaService *quota.QuotaService) func(target string) macaron.Handler {
//https://open.spotify.com/track/7bZSoBEAEEUsGEuLOf94Jm?si=T1Tdju5qRSmmR0zph_6RBw fuuuuunky
return func(target string) macaron.Handler {
return func(c *m.ReqContext) {
limitReached, err := quotaService.QuotaReached(c, target)
if err != nil {
c.JsonApiErr(500, "failed to get quota", err)
return
}
if limitReached {
c.JsonApiErr(403, fmt.Sprintf("%s Quota reached", target), nil)
return
}
}
}
}
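
The quota middleware is now a factory bound to a QuotaService instance (see `quota := middleware.Quota(hs.QuotaService)` in the route registration above); each call with a target name returns a handler that closes over that service. A self-contained sketch of the pattern, using illustrative names rather than Grafana's real types:

```go
package main

import "fmt"

// quotaService stands in for Grafana's quota.QuotaService.
type quotaService struct{ remaining map[string]int }

func (qs *quotaService) reached(target string) bool { return qs.remaining[target] <= 0 }

// Quota binds the service once; the returned function then builds a handler
// per quota target, mirroring the new middleware.Quota signature.
func Quota(qs *quotaService) func(target string) func() {
	return func(target string) func() {
		return func() {
			if qs.reached(target) {
				fmt.Printf("403: %s quota reached\n", target)
				return
			}
			fmt.Printf("200: %s quota ok\n", target)
		}
	}
}

func main() {
	qs := &quotaService{remaining: map[string]int{"user": 3, "dashboard": 0}}
	quota := Quota(qs) // as in registerRoutes: quota := middleware.Quota(hs.QuotaService)
	quota("user")()      // prints "200: user quota ok"
	quota("dashboard")() // prints "403: dashboard quota reached"
}
```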

View File

@ -3,9 +3,10 @@ package middleware
import (
"testing"
"github.com/grafana/grafana/pkg/services/quota"
"github.com/grafana/grafana/pkg/bus"
m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/session"
"github.com/grafana/grafana/pkg/setting"
. "github.com/smartystreets/goconvey/convey"
)
@ -13,10 +14,6 @@ import (
func TestMiddlewareQuota(t *testing.T) {
Convey("Given the grafana quota middleware", t, func() {
session.GetSessionCount = func() int {
return 4
}
setting.AnonymousEnabled = false
setting.Quota = setting.QuotaSettings{
Enabled: true,
@ -39,6 +36,12 @@ func TestMiddlewareQuota(t *testing.T) {
},
}
fakeAuthTokenService := newFakeUserAuthTokenService()
qs := &quota.QuotaService{
AuthTokenService: fakeAuthTokenService,
}
QuotaFn := Quota(qs)
middlewareScenario("with user not logged in", func(sc *scenarioContext) {
bus.AddHandler("globalQuota", func(query *m.GetGlobalQuotaByTargetQuery) error {
query.Result = &m.GlobalQuotaDTO{
@ -48,26 +51,30 @@ func TestMiddlewareQuota(t *testing.T) {
}
return nil
})
Convey("global quota not reached", func() {
sc.m.Get("/user", Quota("user"), sc.defaultHandler)
sc.m.Get("/user", QuotaFn("user"), sc.defaultHandler)
sc.fakeReq("GET", "/user").exec()
So(sc.resp.Code, ShouldEqual, 200)
})
Convey("global quota reached", func() {
setting.Quota.Global.User = 4
sc.m.Get("/user", Quota("user"), sc.defaultHandler)
sc.m.Get("/user", QuotaFn("user"), sc.defaultHandler)
sc.fakeReq("GET", "/user").exec()
So(sc.resp.Code, ShouldEqual, 403)
})
Convey("global session quota not reached", func() {
setting.Quota.Global.Session = 10
sc.m.Get("/user", Quota("session"), sc.defaultHandler)
sc.m.Get("/user", QuotaFn("session"), sc.defaultHandler)
sc.fakeReq("GET", "/user").exec()
So(sc.resp.Code, ShouldEqual, 200)
})
Convey("global session quota reached", func() {
setting.Quota.Global.Session = 1
sc.m.Get("/user", Quota("session"), sc.defaultHandler)
sc.m.Get("/user", QuotaFn("session"), sc.defaultHandler)
sc.fakeReq("GET", "/user").exec()
So(sc.resp.Code, ShouldEqual, 403)
})
@ -95,6 +102,7 @@ func TestMiddlewareQuota(t *testing.T) {
}
return nil
})
bus.AddHandler("userQuota", func(query *m.GetUserQuotaByTargetQuery) error {
query.Result = &m.UserQuotaDTO{
Target: query.Target,
@ -103,6 +111,7 @@ func TestMiddlewareQuota(t *testing.T) {
}
return nil
})
bus.AddHandler("orgQuota", func(query *m.GetOrgQuotaByTargetQuery) error {
query.Result = &m.OrgQuotaDTO{
Target: query.Target,
@ -111,45 +120,49 @@ func TestMiddlewareQuota(t *testing.T) {
}
return nil
})
Convey("global datasource quota reached", func() {
setting.Quota.Global.DataSource = 4
sc.m.Get("/ds", Quota("data_source"), sc.defaultHandler)
sc.m.Get("/ds", QuotaFn("data_source"), sc.defaultHandler)
sc.fakeReq("GET", "/ds").exec()
So(sc.resp.Code, ShouldEqual, 403)
})
Convey("user Org quota not reached", func() {
setting.Quota.User.Org = 5
sc.m.Get("/org", Quota("org"), sc.defaultHandler)
sc.m.Get("/org", QuotaFn("org"), sc.defaultHandler)
sc.fakeReq("GET", "/org").exec()
So(sc.resp.Code, ShouldEqual, 200)
})
Convey("user Org quota reached", func() {
setting.Quota.User.Org = 4
sc.m.Get("/org", Quota("org"), sc.defaultHandler)
sc.m.Get("/org", QuotaFn("org"), sc.defaultHandler)
sc.fakeReq("GET", "/org").exec()
So(sc.resp.Code, ShouldEqual, 403)
})
Convey("org dashboard quota not reached", func() {
setting.Quota.Org.Dashboard = 10
sc.m.Get("/dashboard", Quota("dashboard"), sc.defaultHandler)
sc.m.Get("/dashboard", QuotaFn("dashboard"), sc.defaultHandler)
sc.fakeReq("GET", "/dashboard").exec()
So(sc.resp.Code, ShouldEqual, 200)
})
Convey("org dashboard quota reached", func() {
setting.Quota.Org.Dashboard = 4
sc.m.Get("/dashboard", Quota("dashboard"), sc.defaultHandler)
sc.m.Get("/dashboard", QuotaFn("dashboard"), sc.defaultHandler)
sc.fakeReq("GET", "/dashboard").exec()
So(sc.resp.Code, ShouldEqual, 403)
})
Convey("org dashboard quota reached but quotas disabled", func() {
setting.Quota.Org.Dashboard = 4
setting.Quota.Enabled = false
sc.m.Get("/dashboard", Quota("dashboard"), sc.defaultHandler)
sc.m.Get("/dashboard", QuotaFn("dashboard"), sc.defaultHandler)
sc.fakeReq("GET", "/dashboard").exec()
So(sc.resp.Code, ShouldEqual, 200)
})
})
})
}

View File

@ -6,7 +6,6 @@ import (
"github.com/grafana/grafana/pkg/bus"
m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/session"
"github.com/grafana/grafana/pkg/setting"
. "github.com/smartystreets/goconvey/convey"
macaron "gopkg.in/macaron.v1"
@ -66,7 +65,6 @@ func recoveryScenario(desc string, url string, fn scenarioFunc) {
sc.userAuthTokenService = newFakeUserAuthTokenService()
sc.m.Use(GetContextHandler(sc.userAuthTokenService))
// mock out gc goroutine
session.StartSessionGC = func() {}
sc.m.Use(OrgRedirect())
sc.m.Use(AddDefaultResponseHeaders())

View File

@ -23,7 +23,7 @@ const (
DS_ACCESS_DIRECT = "direct"
DS_ACCESS_PROXY = "proxy"
DS_STACKDRIVER = "stackdriver"
DS_AZURE_MONITOR = "azure-monitor"
DS_AZURE_MONITOR = "grafana-azure-monitor-datasource"
)
var (

View File

@ -8,6 +8,8 @@ import (
"net/http"
"sync"
"time"
"github.com/grafana/grafana/pkg/setting"
)
type proxyTransportCache struct {
@ -46,21 +48,18 @@ func (ds *DataSource) GetHttpTransport() (*http.Transport, error) {
return t.Transport, nil
}
var tlsSkipVerify, tlsClientAuth, tlsAuthWithCACert bool
if ds.JsonData != nil {
tlsClientAuth = ds.JsonData.Get("tlsAuth").MustBool(false)
tlsAuthWithCACert = ds.JsonData.Get("tlsAuthWithCACert").MustBool(false)
tlsSkipVerify = ds.JsonData.Get("tlsSkipVerify").MustBool(false)
tlsConfig, err := ds.GetTLSConfig()
if err != nil {
return nil, err
}
tlsConfig.Renegotiation = tls.RenegotiateFreelyAsClient
transport := &http.Transport{
TLSClientConfig: &tls.Config{
InsecureSkipVerify: tlsSkipVerify,
Renegotiation: tls.RenegotiateFreelyAsClient,
},
Proxy: http.ProxyFromEnvironment,
TLSClientConfig: tlsConfig,
Proxy: http.ProxyFromEnvironment,
Dial: (&net.Dialer{
Timeout: 30 * time.Second,
Timeout: time.Duration(setting.DataProxyTimeout) * time.Second,
KeepAlive: 30 * time.Second,
DualStack: true,
}).Dial,
@ -70,6 +69,26 @@ func (ds *DataSource) GetHttpTransport() (*http.Transport, error) {
IdleConnTimeout: 90 * time.Second,
}
ptc.cache[ds.Id] = cachedTransport{
Transport: transport,
updated: ds.Updated,
}
return transport, nil
}
func (ds *DataSource) GetTLSConfig() (*tls.Config, error) {
var tlsSkipVerify, tlsClientAuth, tlsAuthWithCACert bool
if ds.JsonData != nil {
tlsClientAuth = ds.JsonData.Get("tlsAuth").MustBool(false)
tlsAuthWithCACert = ds.JsonData.Get("tlsAuthWithCACert").MustBool(false)
tlsSkipVerify = ds.JsonData.Get("tlsSkipVerify").MustBool(false)
}
tlsConfig := &tls.Config{
InsecureSkipVerify: tlsSkipVerify,
}
if tlsClientAuth || tlsAuthWithCACert {
decrypted := ds.SecureJsonData.Decrypt()
if tlsAuthWithCACert && len(decrypted["tlsCACert"]) > 0 {
@ -78,7 +97,7 @@ func (ds *DataSource) GetHttpTransport() (*http.Transport, error) {
if !ok {
return nil, errors.New("Failed to parse TLS CA PEM certificate")
}
transport.TLSClientConfig.RootCAs = caPool
tlsConfig.RootCAs = caPool
}
if tlsClientAuth {
@ -86,14 +105,9 @@ func (ds *DataSource) GetHttpTransport() (*http.Transport, error) {
if err != nil {
return nil, err
}
transport.TLSClientConfig.Certificates = []tls.Certificate{cert}
tlsConfig.Certificates = []tls.Certificate{cert}
}
}
ptc.cache[ds.Id] = cachedTransport{
Transport: transport,
updated: ds.Updated,
}
return transport, nil
return tlsConfig, nil
}
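
The TLS setup is now split out into a GetTLSConfig helper that the transport code calls. A self-contained sketch of what that helper assembles (skip-verify flag, CA pool, optional client certificate); the function name and inputs here are illustrative, whereas the real method reads them from the datasource's JSON and secure JSON data:

```go
package main

import (
	"crypto/tls"
	"crypto/x509"
	"errors"
	"fmt"
)

// buildTLSConfig mirrors the shape of the extracted GetTLSConfig logic:
// an insecure flag, an optional CA pool, and an optional client cert/key pair.
func buildTLSConfig(skipVerify bool, caPEM, certPEM, keyPEM []byte) (*tls.Config, error) {
	cfg := &tls.Config{InsecureSkipVerify: skipVerify}
	if len(caPEM) > 0 {
		pool := x509.NewCertPool()
		if !pool.AppendCertsFromPEM(caPEM) {
			return nil, errors.New("failed to parse TLS CA PEM certificate")
		}
		cfg.RootCAs = pool
	}
	if len(certPEM) > 0 && len(keyPEM) > 0 {
		cert, err := tls.X509KeyPair(certPEM, keyPEM)
		if err != nil {
			return nil, err
		}
		cfg.Certificates = []tls.Certificate{cert}
	}
	return cfg, nil
}

func main() {
	// No CA or client cert configured: only the skip-verify flag applies.
	cfg, err := buildTLSConfig(false, nil, nil, nil)
	fmt.Println(cfg.InsecureSkipVerify, err) // false <nil>
}
```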

View File

@ -15,6 +15,7 @@ type SystemStats struct {
FolderPermissions int64
Folders int64
ProvisionedDashboards int64
AuthTokens int64
}
type DataSourceStats struct {

View File

@ -29,4 +29,5 @@ type UserTokenService interface {
LookupToken(unhashedToken string) (*UserToken, error)
TryRotateToken(token *UserToken, clientIP, userAgent string) (bool, error)
RevokeToken(token *UserToken) error
ActiveTokenCount() (int64, error)
}

View File

@ -35,6 +35,13 @@ func (s *UserAuthTokenService) Init() error {
return nil
}
func (s *UserAuthTokenService) ActiveTokenCount() (int64, error) {
var model userAuthToken
count, err := s.SQLStore.NewSession().Where(`created_at > ? AND rotated_at > ?`, s.createdAfterParam(), s.rotatedAfterParam()).Count(&model)
return count, err
}
func (s *UserAuthTokenService) CreateToken(userId int64, clientIP, userAgent string) (*models.UserToken, error) {
clientIP = util.ParseIPAddress(clientIP)
token, err := util.RandomHex(16)
@ -79,13 +86,8 @@ func (s *UserAuthTokenService) LookupToken(unhashedToken string) (*models.UserTo
s.log.Debug("looking up token", "unhashed", unhashedToken, "hashed", hashedToken)
}
tokenMaxLifetime := time.Duration(s.Cfg.LoginMaxLifetimeDays) * 24 * time.Hour
tokenMaxInactiveLifetime := time.Duration(s.Cfg.LoginMaxInactiveLifetimeDays) * 24 * time.Hour
createdAfter := getTime().Add(-tokenMaxLifetime).Unix()
rotatedAfter := getTime().Add(-tokenMaxInactiveLifetime).Unix()
var model userAuthToken
exists, err := s.SQLStore.NewSession().Where("(auth_token = ? OR prev_auth_token = ?) AND created_at > ? AND rotated_at > ?", hashedToken, hashedToken, createdAfter, rotatedAfter).Get(&model)
exists, err := s.SQLStore.NewSession().Where("(auth_token = ? OR prev_auth_token = ?) AND created_at > ? AND rotated_at > ?", hashedToken, hashedToken, s.createdAfterParam(), s.rotatedAfterParam()).Get(&model)
if err != nil {
return nil, err
}
@ -219,6 +221,16 @@ func (s *UserAuthTokenService) RevokeToken(token *models.UserToken) error {
return nil
}
func (s *UserAuthTokenService) createdAfterParam() int64 {
tokenMaxLifetime := time.Duration(s.Cfg.LoginMaxLifetimeDays) * 24 * time.Hour
return getTime().Add(-tokenMaxLifetime).Unix()
}
func (s *UserAuthTokenService) rotatedAfterParam() int64 {
tokenMaxInactiveLifetime := time.Duration(s.Cfg.LoginMaxInactiveLifetimeDays) * 24 * time.Hour
return getTime().Add(-tokenMaxInactiveLifetime).Unix()
}
func hashToken(token string) string {
hashBytes := sha256.Sum256([]byte(token + setting.SecretKey))
return hex.EncodeToString(hashBytes[:])

View File

@ -31,6 +31,12 @@ func TestUserAuthToken(t *testing.T) {
So(userToken, ShouldNotBeNil)
So(userToken.AuthTokenSeen, ShouldBeFalse)
Convey("Can count active tokens", func() {
count, err := userAuthTokenService.ActiveTokenCount()
So(err, ShouldBeNil)
So(count, ShouldEqual, 1)
})
Convey("When lookup unhashed token should return user auth token", func() {
userToken, err := userAuthTokenService.LookupToken(userToken.UnhashedToken)
So(err, ShouldBeNil)
@ -114,6 +120,12 @@ func TestUserAuthToken(t *testing.T) {
notGood, err := userAuthTokenService.LookupToken(userToken.UnhashedToken)
So(err, ShouldEqual, models.ErrUserTokenNotFound)
So(notGood, ShouldBeNil)
Convey("should not find active token when expired", func() {
count, err := userAuthTokenService.ActiveTokenCount()
So(err, ShouldBeNil)
So(count, ShouldEqual, 0)
})
})
Convey("when rotated_at is 5 days ago and created_at is 29 days and 23:59:59 ago should not find token", func() {

View File

@ -59,7 +59,7 @@ func (cr *configReader) readConfig() ([]*DashboardsAsConfig, error) {
files, err := ioutil.ReadDir(cr.path)
if err != nil {
cr.log.Error("can't read dashboard provisioning files from directory", "path", cr.path)
cr.log.Error("can't read dashboard provisioning files from directory", "path", cr.path, "error", err)
return dashboards, nil
}

View File

@ -19,7 +19,7 @@ func (cr *configReader) readConfig(path string) ([]*DatasourcesAsConfig, error)
files, err := ioutil.ReadDir(path)
if err != nil {
cr.log.Error("can't read datasource provisioning files from directory", "path", path)
cr.log.Error("can't read datasource provisioning files from directory", "path", path, "error", err)
return datasources, nil
}

View File

@ -23,7 +23,7 @@ func (cr *configReader) readConfig(path string) ([]*notificationsAsConfig, error
files, err := ioutil.ReadDir(path)
if err != nil {
cr.log.Error("Can't read alert notification provisioning files from directory", "path", path)
cr.log.Error("Can't read alert notification provisioning files from directory", "path", path, "error", err)
return notifications, nil
}

View File

@ -3,11 +3,23 @@ package quota
import (
"github.com/grafana/grafana/pkg/bus"
m "github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/session"
"github.com/grafana/grafana/pkg/registry"
"github.com/grafana/grafana/pkg/setting"
)
func QuotaReached(c *m.ReqContext, target string) (bool, error) {
func init() {
registry.RegisterService(&QuotaService{})
}
type QuotaService struct {
AuthTokenService m.UserTokenService `inject:""`
}
func (qs *QuotaService) Init() error {
return nil
}
func (qs *QuotaService) QuotaReached(c *m.ReqContext, target string) (bool, error) {
if !setting.Quota.Enabled {
return false, nil
}
@ -30,7 +42,12 @@ func QuotaReached(c *m.ReqContext, target string) (bool, error) {
return true, nil
}
if target == "session" {
usedSessions := session.GetSessionCount()
usedSessions, err := qs.AuthTokenService.ActiveTokenCount()
if err != nil {
return false, err
}
if int64(usedSessions) > scope.DefaultLimit {
c.Logger.Debug("Sessions limit reached", "active", usedSessions, "limit", scope.DefaultLimit)
return true, nil

View File

@ -19,7 +19,7 @@ const (
var sessionManager *ms.Manager
var sessionOptions *ms.Options
var StartSessionGC func()
var StartSessionGC func() = func() {}
var GetSessionCount func() int
var sessionLogger = log.New("session")
var sessionConnMaxLifetime int64

View File

@ -74,7 +74,8 @@ func GetSystemStats(query *m.GetSystemStatsQuery) error {
sb.Write(`(SELECT COUNT(id) FROM ` + dialect.Quote("dashboard_provisioning") + `) AS provisioned_dashboards,`)
sb.Write(`(SELECT COUNT(id) FROM ` + dialect.Quote("dashboard_snapshot") + `) AS snapshots,`)
sb.Write(`(SELECT COUNT(id) FROM ` + dialect.Quote("team") + `) AS teams`)
sb.Write(`(SELECT COUNT(id) FROM ` + dialect.Quote("team") + `) AS teams,`)
sb.Write(`(SELECT COUNT(id) FROM ` + dialect.Quote("user_auth_token") + `) AS auth_tokens`)
var stats m.SystemStats
_, err := x.SQL(sb.GetSqlString(), sb.params...).Get(&stats)

View File

@ -0,0 +1,312 @@
package azuremonitor
import (
"context"
"encoding/json"
"errors"
"fmt"
"io/ioutil"
"net/http"
"net/url"
"path"
"strings"
"time"
"github.com/grafana/grafana/pkg/api/pluginproxy"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/setting"
opentracing "github.com/opentracing/opentracing-go"
"golang.org/x/net/context/ctxhttp"
"github.com/grafana/grafana/pkg/components/null"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/tsdb"
)
// AzureMonitorDatasource calls the Azure Monitor API - one of the four APIs supported
type AzureMonitorDatasource struct {
httpClient *http.Client
dsInfo *models.DataSource
}
var (
// 1m, 5m, 15m, 30m, 1h, 6h, 12h, 1d in milliseconds
allowedIntervalsMS = []int64{60000, 300000, 900000, 1800000, 3600000, 21600000, 43200000, 86400000}
)
// executeTimeSeriesQuery does the following:
// 1. builds the Azure Monitor URL and querystring for each query
// 2. executes each query by calling the Azure Monitor API
// 3. parses the responses for each query into the timeseries format
func (e *AzureMonitorDatasource) executeTimeSeriesQuery(ctx context.Context, originalQueries []*tsdb.Query, timeRange *tsdb.TimeRange) (*tsdb.Response, error) {
result := &tsdb.Response{
Results: map[string]*tsdb.QueryResult{},
}
queries, err := e.buildQueries(originalQueries, timeRange)
if err != nil {
return nil, err
}
for _, query := range queries {
queryRes, resp, err := e.executeQuery(ctx, query, originalQueries, timeRange)
if err != nil {
return nil, err
}
// azlog.Debug("AzureMonitor", "Response", resp)
err = e.parseResponse(queryRes, resp, query)
if err != nil {
queryRes.Error = err
}
result.Results[query.RefID] = queryRes
}
return result, nil
}
func (e *AzureMonitorDatasource) buildQueries(queries []*tsdb.Query, timeRange *tsdb.TimeRange) ([]*AzureMonitorQuery, error) {
azureMonitorQueries := []*AzureMonitorQuery{}
startTime, err := timeRange.ParseFrom()
if err != nil {
return nil, err
}
endTime, err := timeRange.ParseTo()
if err != nil {
return nil, err
}
for _, query := range queries {
var target string
azureMonitorTarget := query.Model.Get("azureMonitor").MustMap()
azlog.Debug("AzureMonitor", "target", azureMonitorTarget)
urlComponents := map[string]string{}
urlComponents["resourceGroup"] = fmt.Sprintf("%v", azureMonitorTarget["resourceGroup"])
urlComponents["metricDefinition"] = fmt.Sprintf("%v", azureMonitorTarget["metricDefinition"])
urlComponents["resourceName"] = fmt.Sprintf("%v", azureMonitorTarget["resourceName"])
ub := urlBuilder{
ResourceGroup: urlComponents["resourceGroup"],
MetricDefinition: urlComponents["metricDefinition"],
ResourceName: urlComponents["resourceName"],
}
azureURL := ub.Build()
alias := fmt.Sprintf("%v", azureMonitorTarget["alias"])
timeGrain := fmt.Sprintf("%v", azureMonitorTarget["timeGrain"])
if timeGrain == "auto" {
autoInterval := e.findClosestAllowedIntervalMS(query.IntervalMs)
tg := &TimeGrain{}
timeGrain, err = tg.createISO8601DurationFromIntervalMS(autoInterval)
if err != nil {
return nil, err
}
}
params := url.Values{}
params.Add("api-version", "2018-01-01")
params.Add("timespan", fmt.Sprintf("%v/%v", startTime.UTC().Format(time.RFC3339), endTime.UTC().Format(time.RFC3339)))
params.Add("interval", timeGrain)
params.Add("aggregation", fmt.Sprintf("%v", azureMonitorTarget["aggregation"]))
params.Add("metricnames", fmt.Sprintf("%v", azureMonitorTarget["metricName"]))
dimension := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorTarget["dimension"]))
dimensionFilter := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorTarget["dimensionFilter"]))
if azureMonitorTarget["dimension"] != nil && azureMonitorTarget["dimensionFilter"] != nil && len(dimension) > 0 && len(dimensionFilter) > 0 {
params.Add("$filter", fmt.Sprintf("%s eq '%s'", dimension, dimensionFilter))
}
target = params.Encode()
if setting.Env == setting.DEV {
azlog.Debug("Azuremonitor request", "params", params)
}
azureMonitorQueries = append(azureMonitorQueries, &AzureMonitorQuery{
URL: azureURL,
UrlComponents: urlComponents,
Target: target,
Params: params,
RefID: query.RefId,
Alias: alias,
})
}
return azureMonitorQueries, nil
}
func (e *AzureMonitorDatasource) executeQuery(ctx context.Context, query *AzureMonitorQuery, queries []*tsdb.Query, timeRange *tsdb.TimeRange) (*tsdb.QueryResult, AzureMonitorResponse, error) {
queryResult := &tsdb.QueryResult{Meta: simplejson.New(), RefId: query.RefID}
req, err := e.createRequest(ctx, e.dsInfo)
if err != nil {
queryResult.Error = err
return queryResult, AzureMonitorResponse{}, nil
}
req.URL.Path = path.Join(req.URL.Path, query.URL)
req.URL.RawQuery = query.Params.Encode()
queryResult.Meta.Set("rawQuery", req.URL.RawQuery)
span, ctx := opentracing.StartSpanFromContext(ctx, "azuremonitor query")
span.SetTag("target", query.Target)
span.SetTag("from", timeRange.From)
span.SetTag("until", timeRange.To)
span.SetTag("datasource_id", e.dsInfo.Id)
span.SetTag("org_id", e.dsInfo.OrgId)
defer span.Finish()
opentracing.GlobalTracer().Inject(
span.Context(),
opentracing.HTTPHeaders,
opentracing.HTTPHeadersCarrier(req.Header))
azlog.Debug("AzureMonitor", "Request URL", req.URL.String())
res, err := ctxhttp.Do(ctx, e.httpClient, req)
if err != nil {
queryResult.Error = err
return queryResult, AzureMonitorResponse{}, nil
}
data, err := e.unmarshalResponse(res)
if err != nil {
queryResult.Error = err
return queryResult, AzureMonitorResponse{}, nil
}
return queryResult, data, nil
}
func (e *AzureMonitorDatasource) createRequest(ctx context.Context, dsInfo *models.DataSource) (*http.Request, error) {
// find plugin
plugin, ok := plugins.DataSources[dsInfo.Type]
if !ok {
return nil, errors.New("Unable to find datasource plugin Azure Monitor")
}
var azureMonitorRoute *plugins.AppPluginRoute
for _, route := range plugin.Routes {
if route.Path == "azuremonitor" {
azureMonitorRoute = route
break
}
}
cloudName := dsInfo.JsonData.Get("cloudName").MustString("azuremonitor")
subscriptionID := dsInfo.JsonData.Get("subscriptionId").MustString()
proxyPass := fmt.Sprintf("%s/subscriptions/%s", cloudName, subscriptionID)
u, _ := url.Parse(dsInfo.Url)
u.Path = path.Join(u.Path, "render")
req, err := http.NewRequest(http.MethodGet, u.String(), nil)
if err != nil {
azlog.Error("Failed to create request", "error", err)
return nil, fmt.Errorf("Failed to create request. error: %v", err)
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("User-Agent", fmt.Sprintf("Grafana/%s", setting.BuildVersion))
pluginproxy.ApplyRoute(ctx, req, proxyPass, azureMonitorRoute, dsInfo)
return req, nil
}
func (e *AzureMonitorDatasource) unmarshalResponse(res *http.Response) (AzureMonitorResponse, error) {
body, err := ioutil.ReadAll(res.Body)
defer res.Body.Close()
if err != nil {
return AzureMonitorResponse{}, err
}
if res.StatusCode/100 != 2 {
azlog.Error("Request failed", "status", res.Status, "body", string(body))
return AzureMonitorResponse{}, fmt.Errorf(string(body))
}
var data AzureMonitorResponse
err = json.Unmarshal(body, &data)
if err != nil {
azlog.Error("Failed to unmarshal AzureMonitor response", "error", err, "status", res.Status, "body", string(body))
return AzureMonitorResponse{}, err
}
return data, nil
}
func (e *AzureMonitorDatasource) parseResponse(queryRes *tsdb.QueryResult, data AzureMonitorResponse, query *AzureMonitorQuery) error {
if len(data.Value) == 0 {
return nil
}
for _, series := range data.Value[0].Timeseries {
points := []tsdb.TimePoint{}
metadataName := ""
metadataValue := ""
if len(series.Metadatavalues) > 0 {
metadataName = series.Metadatavalues[0].Name.LocalizedValue
metadataValue = series.Metadatavalues[0].Value
}
defaultMetricName := formatLegendKey(query.UrlComponents["resourceName"], data.Value[0].Name.LocalizedValue, metadataName, metadataValue)
for _, point := range series.Data {
var value float64
switch query.Params.Get("aggregation") {
case "Average":
value = point.Average
case "Total":
value = point.Total
case "Maximum":
value = point.Maximum
case "Minimum":
value = point.Minimum
case "Count":
value = point.Count
default:
value = point.Count
}
points = append(points, tsdb.NewTimePoint(null.FloatFrom(value), float64((point.TimeStamp).Unix())*1000))
}
queryRes.Series = append(queryRes.Series, &tsdb.TimeSeries{
Name: defaultMetricName,
Points: points,
})
}
return nil
}
// findClosestAllowedIntervalMS is used for the auto time grain setting.
// It finds the closest time grain from the list of allowed time grains for Azure Monitor
// using the Grafana interval in milliseconds
func (e *AzureMonitorDatasource) findClosestAllowedIntervalMS(intervalMs int64) int64 {
closest := allowedIntervalsMS[0]
for i, allowed := range allowedIntervalsMS {
if intervalMs > allowed {
if i+1 < len(allowedIntervalsMS) {
closest = allowedIntervalsMS[i+1]
} else {
closest = allowed
}
}
}
return closest
}
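// For illustration, using the allowedIntervalsMS list above: a Grafana auto interval of
// 3m (180000 ms) maps to 5m, 10m maps to 15m, and anything above 1d is capped at 1d;
// in other words, the function rounds up to the next allowed interval rather than
// picking the numerically closest one.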
// formatLegendKey builds the legend key or timeseries name
func formatLegendKey(resourceName string, metricName string, metadataName string, metadataValue string) string {
if len(metadataName) > 0 {
return fmt.Sprintf("%s{%s=%s}.%s", resourceName, metadataName, metadataValue, metricName)
}
return fmt.Sprintf("%s.%s", resourceName, metricName)
}

View File

@ -0,0 +1,264 @@
package azuremonitor
import (
"encoding/json"
"fmt"
"io/ioutil"
"net/url"
"testing"
"time"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/tsdb"
. "github.com/smartystreets/goconvey/convey"
)
func TestAzureMonitorDatasource(t *testing.T) {
Convey("AzureMonitorDatasource", t, func() {
datasource := &AzureMonitorDatasource{}
Convey("Parse queries from frontend and build AzureMonitor API queries", func() {
fromStart := time.Date(2018, 3, 15, 13, 0, 0, 0, time.UTC).In(time.Local)
tsdbQuery := &tsdb.TsdbQuery{
TimeRange: &tsdb.TimeRange{
From: fmt.Sprintf("%v", fromStart.Unix()*1000),
To: fmt.Sprintf("%v", fromStart.Add(34*time.Minute).Unix()*1000),
},
Queries: []*tsdb.Query{
{
Model: simplejson.NewFromAny(map[string]interface{}{
"azureMonitor": map[string]interface{}{
"timeGrain": "PT1M",
"aggregation": "Average",
"resourceGroup": "grafanastaging",
"resourceName": "grafana",
"metricDefinition": "Microsoft.Compute/virtualMachines",
"metricName": "Percentage CPU",
"alias": "testalias",
"queryType": "Azure Monitor",
},
}),
RefId: "A",
},
},
}
Convey("and is a normal query", func() {
queries, err := datasource.buildQueries(tsdbQuery.Queries, tsdbQuery.TimeRange)
So(err, ShouldBeNil)
So(len(queries), ShouldEqual, 1)
So(queries[0].RefID, ShouldEqual, "A")
So(queries[0].URL, ShouldEqual, "resourceGroups/grafanastaging/providers/Microsoft.Compute/virtualMachines/grafana/providers/microsoft.insights/metrics")
So(queries[0].Target, ShouldEqual, "aggregation=Average&api-version=2018-01-01&interval=PT1M&metricnames=Percentage+CPU&timespan=2018-03-15T13%3A00%3A00Z%2F2018-03-15T13%3A34%3A00Z")
So(len(queries[0].Params), ShouldEqual, 5)
So(queries[0].Params["timespan"][0], ShouldEqual, "2018-03-15T13:00:00Z/2018-03-15T13:34:00Z")
So(queries[0].Params["api-version"][0], ShouldEqual, "2018-01-01")
So(queries[0].Params["aggregation"][0], ShouldEqual, "Average")
So(queries[0].Params["metricnames"][0], ShouldEqual, "Percentage CPU")
So(queries[0].Params["interval"][0], ShouldEqual, "PT1M")
So(queries[0].Alias, ShouldEqual, "testalias")
})
Convey("and has a dimension filter", func() {
tsdbQuery.Queries[0].Model = simplejson.NewFromAny(map[string]interface{}{
"azureMonitor": map[string]interface{}{
"timeGrain": "PT1M",
"aggregation": "Average",
"resourceGroup": "grafanastaging",
"resourceName": "grafana",
"metricDefinition": "Microsoft.Compute/virtualMachines",
"metricName": "Percentage CPU",
"alias": "testalias",
"queryType": "Azure Monitor",
"dimension": "blob",
"dimensionFilter": "*",
},
})
queries, err := datasource.buildQueries(tsdbQuery.Queries, tsdbQuery.TimeRange)
So(err, ShouldBeNil)
So(queries[0].Target, ShouldEqual, "%24filter=blob+eq+%27%2A%27&aggregation=Average&api-version=2018-01-01&interval=PT1M&metricnames=Percentage+CPU&timespan=2018-03-15T13%3A00%3A00Z%2F2018-03-15T13%3A34%3A00Z")
})
})
Convey("Parse AzureMonitor API response in the time series format", func() {
Convey("when data from query aggregated as average to one time series", func() {
data, err := loadTestFile("./test-data/1-azure-monitor-response-avg.json")
So(err, ShouldBeNil)
So(data.Interval, ShouldEqual, "PT1M")
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Average"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(len(res.Series), ShouldEqual, 1)
So(res.Series[0].Name, ShouldEqual, "grafana.Percentage CPU")
So(len(res.Series[0].Points), ShouldEqual, 5)
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 2.0875)
So(res.Series[0].Points[0][1].Float64, ShouldEqual, 1549620780000)
So(res.Series[0].Points[1][0].Float64, ShouldEqual, 2.1525)
So(res.Series[0].Points[1][1].Float64, ShouldEqual, 1549620840000)
So(res.Series[0].Points[2][0].Float64, ShouldEqual, 2.155)
So(res.Series[0].Points[2][1].Float64, ShouldEqual, 1549620900000)
So(res.Series[0].Points[3][0].Float64, ShouldEqual, 3.6925)
So(res.Series[0].Points[3][1].Float64, ShouldEqual, 1549620960000)
So(res.Series[0].Points[4][0].Float64, ShouldEqual, 2.44)
So(res.Series[0].Points[4][1].Float64, ShouldEqual, 1549621020000)
})
Convey("when data from query aggregated as total to one time series", func() {
data, err := loadTestFile("./test-data/2-azure-monitor-response-total.json")
So(err, ShouldBeNil)
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Total"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 8.26)
So(res.Series[0].Points[0][1].Float64, ShouldEqual, 1549718940000)
})
Convey("when data from query aggregated as maximum to one time series", func() {
data, err := loadTestFile("./test-data/3-azure-monitor-response-maximum.json")
So(err, ShouldBeNil)
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Maximum"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 3.07)
So(res.Series[0].Points[0][1].Float64, ShouldEqual, 1549722360000)
})
Convey("when data from query aggregated as minimum to one time series", func() {
data, err := loadTestFile("./test-data/4-azure-monitor-response-minimum.json")
So(err, ShouldBeNil)
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Minimum"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 1.51)
So(res.Series[0].Points[0][1].Float64, ShouldEqual, 1549723380000)
})
Convey("when data from query aggregated as Count to one time series", func() {
data, err := loadTestFile("./test-data/5-azure-monitor-response-count.json")
So(err, ShouldBeNil)
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Count"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 4)
So(res.Series[0].Points[0][1].Float64, ShouldEqual, 1549723440000)
})
Convey("when data from query aggregated as total and has dimension filter", func() {
data, err := loadTestFile("./test-data/6-azure-monitor-response-multi-dimension.json")
So(err, ShouldBeNil)
res := &tsdb.QueryResult{Meta: simplejson.New(), RefId: "A"}
query := &AzureMonitorQuery{
UrlComponents: map[string]string{
"resourceName": "grafana",
},
Params: url.Values{
"aggregation": {"Average"},
},
}
err = datasource.parseResponse(res, data, query)
So(err, ShouldBeNil)
So(len(res.Series), ShouldEqual, 3)
So(res.Series[0].Name, ShouldEqual, "grafana{blobtype=PageBlob}.Blob Count")
So(res.Series[0].Points[0][0].Float64, ShouldEqual, 3)
So(res.Series[1].Name, ShouldEqual, "grafana{blobtype=BlockBlob}.Blob Count")
So(res.Series[1].Points[0][0].Float64, ShouldEqual, 1)
So(res.Series[2].Name, ShouldEqual, "grafana{blobtype=Azure Data Lake Storage}.Blob Count")
So(res.Series[2].Points[0][0].Float64, ShouldEqual, 0)
})
})
Convey("Find closest allowed interval for auto time grain", func() {
intervals := map[string]int64{
"3m": 180000,
"5m": 300000,
"10m": 600000,
"15m": 900000,
"1d": 86400000,
"2d": 172800000,
}
closest := datasource.findClosestAllowedIntervalMS(intervals["3m"])
So(closest, ShouldEqual, intervals["5m"])
closest = datasource.findClosestAllowedIntervalMS(intervals["10m"])
So(closest, ShouldEqual, intervals["15m"])
closest = datasource.findClosestAllowedIntervalMS(intervals["2d"])
So(closest, ShouldEqual, intervals["1d"])
})
})
}
func loadTestFile(path string) (AzureMonitorResponse, error) {
var data AzureMonitorResponse
jsonBody, err := ioutil.ReadFile(path)
if err != nil {
return data, err
}
err = json.Unmarshal(jsonBody, &data)
return data, err
}

View File

@ -0,0 +1,70 @@
package azuremonitor
import (
"context"
"fmt"
"net/http"
"github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/tsdb"
)
var (
azlog log.Logger
)
// AzureMonitorExecutor executes queries for the Azure Monitor datasource - all four services
type AzureMonitorExecutor struct {
httpClient *http.Client
dsInfo *models.DataSource
}
// NewAzureMonitorExecutor initializes an HTTP client
func NewAzureMonitorExecutor(dsInfo *models.DataSource) (tsdb.TsdbQueryEndpoint, error) {
httpClient, err := dsInfo.GetHttpClient()
if err != nil {
return nil, err
}
return &AzureMonitorExecutor{
httpClient: httpClient,
dsInfo: dsInfo,
}, nil
}
func init() {
azlog = log.New("tsdb.azuremonitor")
tsdb.RegisterTsdbQueryEndpoint("grafana-azure-monitor-datasource", NewAzureMonitorExecutor)
}
// Query takes in the frontend queries, parses them into the query format
// expected by the chosen Azure Monitor service (Azure Monitor, App Insights etc.),
// executes the queries against the API and parses the response into
// the right format
func (e *AzureMonitorExecutor) Query(ctx context.Context, dsInfo *models.DataSource, tsdbQuery *tsdb.TsdbQuery) (*tsdb.Response, error) {
var result *tsdb.Response
var err error
var azureMonitorQueries []*tsdb.Query
for _, query := range tsdbQuery.Queries {
queryType := query.Model.Get("queryType").MustString("")
switch queryType {
case "Azure Monitor":
azureMonitorQueries = append(azureMonitorQueries, query)
default:
return nil, fmt.Errorf("Alerting not supported for %s", queryType)
}
}
azDatasource := &AzureMonitorDatasource{
httpClient: e.httpClient,
dsInfo: e.dsInfo,
}
result, err = azDatasource.executeTimeSeriesQuery(ctx, azureMonitorQueries, tsdbQuery.TimeRange)
return result, err
}

View File

@ -0,0 +1,47 @@
{
"cost": 0,
"timespan": "2019-02-08T10:13:50Z\/2019-02-08T16:13:50Z",
"interval": "PT1M",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Compute\/virtualMachines\/grafana\/providers\/Microsoft.Insights\/metrics\/Percentage CPU",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "Percentage CPU",
"localizedValue": "Percentage CPU"
},
"unit": "Percent",
"timeseries": [
{
"metadatavalues": [
],
"data": [
{
"timeStamp": "2019-02-08T10:13:00Z",
"average": 2.0875
},
{
"timeStamp": "2019-02-08T10:14:00Z",
"average": 2.1525
},
{
"timeStamp": "2019-02-08T10:15:00Z",
"average": 2.155
},
{
"timeStamp": "2019-02-08T10:16:00Z",
"average": 3.6925
},
{
"timeStamp": "2019-02-08T10:17:00Z",
"average": 2.44
}
]
}
]
}
],
"namespace": "Microsoft.Compute\/virtualMachines",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,47 @@
{
"cost": 0,
"timespan": "2019-02-09T13:29:41Z\/2019-02-09T19:29:41Z",
"interval": "PT1M",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Compute\/virtualMachines\/grafana\/providers\/Microsoft.Insights\/metrics\/Percentage CPU",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "Percentage CPU",
"localizedValue": "Percentage CPU"
},
"unit": "Percent",
"timeseries": [
{
"metadatavalues": [
],
"data": [
{
"timeStamp": "2019-02-09T13:29:00Z",
"total": 8.26
},
{
"timeStamp": "2019-02-09T13:30:00Z",
"total": 8.7
},
{
"timeStamp": "2019-02-09T13:31:00Z",
"total": 14.82
},
{
"timeStamp": "2019-02-09T13:32:00Z",
"total": 10.07
},
{
"timeStamp": "2019-02-09T13:33:00Z",
"total": 8.52
}
]
}
]
}
],
"namespace": "Microsoft.Compute\/virtualMachines",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,47 @@
{
"cost": 0,
"timespan": "2019-02-09T14:26:12Z\/2019-02-09T20:26:12Z",
"interval": "PT1M",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Compute\/virtualMachines\/grafana\/providers\/Microsoft.Insights\/metrics\/Percentage CPU",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "Percentage CPU",
"localizedValue": "Percentage CPU"
},
"unit": "Percent",
"timeseries": [
{
"metadatavalues": [
],
"data": [
{
"timeStamp": "2019-02-09T14:26:00Z",
"maximum": 3.07
},
{
"timeStamp": "2019-02-09T14:27:00Z",
"maximum": 2.92
},
{
"timeStamp": "2019-02-09T14:28:00Z",
"maximum": 2.87
},
{
"timeStamp": "2019-02-09T14:29:00Z",
"maximum": 2.27
},
{
"timeStamp": "2019-02-09T14:30:00Z",
"maximum": 2.52
}
]
}
]
}
],
"namespace": "Microsoft.Compute\/virtualMachines",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,47 @@
{
"cost": 0,
"timespan": "2019-02-09T14:43:21Z\/2019-02-09T20:43:21Z",
"interval": "PT1M",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Compute\/virtualMachines\/grafana\/providers\/Microsoft.Insights\/metrics\/Percentage CPU",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "Percentage CPU",
"localizedValue": "Percentage CPU"
},
"unit": "Percent",
"timeseries": [
{
"metadatavalues": [
],
"data": [
{
"timeStamp": "2019-02-09T14:43:00Z",
"minimum": 1.51
},
{
"timeStamp": "2019-02-09T14:44:00Z",
"minimum": 2.38
},
{
"timeStamp": "2019-02-09T14:45:00Z",
"minimum": 1.69
},
{
"timeStamp": "2019-02-09T14:46:00Z",
"minimum": 2.27
},
{
"timeStamp": "2019-02-09T14:47:00Z",
"minimum": 1.96
}
]
}
]
}
],
"namespace": "Microsoft.Compute\/virtualMachines",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,47 @@
{
"cost": 0,
"timespan": "2019-02-09T14:44:52Z\/2019-02-09T20:44:52Z",
"interval": "PT1M",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Compute\/virtualMachines\/grafana\/providers\/Microsoft.Insights\/metrics\/Percentage CPU",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "Percentage CPU",
"localizedValue": "Percentage CPU"
},
"unit": "Percent",
"timeseries": [
{
"metadatavalues": [
],
"data": [
{
"timeStamp": "2019-02-09T14:44:00Z",
"count": 4
},
{
"timeStamp": "2019-02-09T14:45:00Z",
"count": 4
},
{
"timeStamp": "2019-02-09T14:46:00Z",
"count": 4
},
{
"timeStamp": "2019-02-09T14:47:00Z",
"count": 4
},
{
"timeStamp": "2019-02-09T14:48:00Z",
"count": 4
}
]
}
]
}
],
"namespace": "Microsoft.Compute\/virtualMachines",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,128 @@
{
"cost": 0,
"timespan": "2019-02-09T15:21:39Z\/2019-02-09T21:21:39Z",
"interval": "PT1H",
"value": [
{
"id": "\/subscriptions\/xxx\/resourceGroups\/grafanastaging\/providers\/Microsoft.Storage\/storageAccounts\/grafanastaging\/blobServices\/default\/providers\/Microsoft.Insights\/metrics\/BlobCount",
"type": "Microsoft.Insights\/metrics",
"name": {
"value": "BlobCount",
"localizedValue": "Blob Count"
},
"unit": "Count",
"timeseries": [
{
"metadatavalues": [
{
"name": {
"value": "blobtype",
"localizedValue": "blobtype"
},
"value": "PageBlob"
}
],
"data": [
{
"timeStamp": "2019-02-09T15:21:00Z",
"average": 3
},
{
"timeStamp": "2019-02-09T16:21:00Z",
"average": 3
},
{
"timeStamp": "2019-02-09T17:21:00Z",
"average": 3
},
{
"timeStamp": "2019-02-09T18:21:00Z",
"average": 3
},
{
"timeStamp": "2019-02-09T19:21:00Z",
"average": 3
},
{
"timeStamp": "2019-02-09T20:21:00Z"
}
]
},
{
"metadatavalues": [
{
"name": {
"value": "blobtype",
"localizedValue": "blobtype"
},
"value": "BlockBlob"
}
],
"data": [
{
"timeStamp": "2019-02-09T15:21:00Z",
"average": 1
},
{
"timeStamp": "2019-02-09T16:21:00Z",
"average": 1
},
{
"timeStamp": "2019-02-09T17:21:00Z",
"average": 1
},
{
"timeStamp": "2019-02-09T18:21:00Z",
"average": 1
},
{
"timeStamp": "2019-02-09T19:21:00Z",
"average": 1
},
{
"timeStamp": "2019-02-09T20:21:00Z"
}
]
},
{
"metadatavalues": [
{
"name": {
"value": "blobtype",
"localizedValue": "blobtype"
},
"value": "Azure Data Lake Storage"
}
],
"data": [
{
"timeStamp": "2019-02-09T15:21:00Z",
"average": 0
},
{
"timeStamp": "2019-02-09T16:21:00Z",
"average": 0
},
{
"timeStamp": "2019-02-09T17:21:00Z",
"average": 0
},
{
"timeStamp": "2019-02-09T18:21:00Z",
"average": 0
},
{
"timeStamp": "2019-02-09T19:21:00Z",
"average": 0
},
{
"timeStamp": "2019-02-09T20:21:00Z"
}
]
}
]
}
],
"namespace": "Microsoft.Storage\/storageAccounts\/blobServices",
"resourceregion": "westeurope"
}

View File

@ -0,0 +1,52 @@
package azuremonitor
import (
"fmt"
"strconv"
"strings"
"time"
"github.com/grafana/grafana/pkg/tsdb"
)
// TimeGrain handles conversions between
// the ISO 8601 Duration format (PT1H), Kbn units (1h) and Time Grains (1 hour).
// Also handles using the automatic Grafana interval to calculate an ISO 8601 Duration.
type TimeGrain struct{}
var (
smallTimeUnits = []string{"hour", "minute", "h", "m"}
)
func (tg *TimeGrain) createISO8601DurationFromIntervalMS(interval int64) (string, error) {
formatted := tsdb.FormatDuration(time.Duration(interval) * time.Millisecond)
if strings.Contains(formatted, "ms") {
return "PT1M", nil
}
timeValueString := formatted[0 : len(formatted)-1]
timeValue, err := strconv.Atoi(timeValueString)
if err != nil {
return "", fmt.Errorf("Could not parse interval %v to an ISO 8061 duration", interval)
}
unit := formatted[len(formatted)-1:]
if unit == "s" && timeValue < 60 {
// minimum interval is 1m for Azure Monitor
return "PT1M", nil
}
return tg.createISO8601Duration(timeValue, unit), nil
}
func (tg *TimeGrain) createISO8601Duration(timeValue int, timeUnit string) string {
for _, smallTimeUnit := range smallTimeUnits {
if timeUnit == smallTimeUnit {
return fmt.Sprintf("PT%v%v", timeValue, strings.ToUpper(timeUnit[0:1]))
}
}
return fmt.Sprintf("P%v%v", timeValue, strings.ToUpper(timeUnit[0:1]))
}
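// For reference: in an ISO 8601 duration the "T" separates date components from time
// components, so sub-day grains get the PT prefix (PT1M, PT2H) while day grains use a
// plain P prefix (P1D, P2D); that is the distinction the smallTimeUnits check above makes.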

View File

@ -0,0 +1,71 @@
package azuremonitor
import (
"testing"
. "github.com/smartystreets/goconvey/convey"
)
func TestTimeGrain(t *testing.T) {
Convey("TimeGrain", t, func() {
tgc := &TimeGrain{}
Convey("create ISO 8601 Duration", func() {
Convey("when given a time unit smaller than a day", func() {
minuteKbnDuration := tgc.createISO8601Duration(1, "m")
hourKbnDuration := tgc.createISO8601Duration(2, "h")
minuteDuration := tgc.createISO8601Duration(1, "minute")
hourDuration := tgc.createISO8601Duration(2, "hour")
Convey("should convert it to a time duration", func() {
So(minuteKbnDuration, ShouldEqual, "PT1M")
So(hourKbnDuration, ShouldEqual, "PT2H")
So(minuteDuration, ShouldEqual, "PT1M")
So(hourDuration, ShouldEqual, "PT2H")
})
})
Convey("when given the day time unit", func() {
kbnDuration := tgc.createISO8601Duration(1, "d")
duration := tgc.createISO8601Duration(2, "day")
Convey("should convert it to a date duration", func() {
So(kbnDuration, ShouldEqual, "P1D")
So(duration, ShouldEqual, "P2D")
})
})
})
Convey("create ISO 8601 Duration from Grafana interval in milliseconds", func() {
Convey("and interval is less than a minute", func() {
durationMS, err := tgc.createISO8601DurationFromIntervalMS(100)
So(err, ShouldBeNil)
durationS, err := tgc.createISO8601DurationFromIntervalMS(59999)
So(err, ShouldBeNil)
Convey("should be rounded up to a minute as is the minimum interval for Azure Monitor", func() {
So(durationMS, ShouldEqual, "PT1M")
So(durationS, ShouldEqual, "PT1M")
})
})
Convey("and interval is more than a minute", func() {
intervals := map[string]int64{
"10m": 600000,
"2d": 172800000,
}
durationM, err := tgc.createISO8601DurationFromIntervalMS(intervals["10m"])
So(err, ShouldBeNil)
durationD, err := tgc.createISO8601DurationFromIntervalMS(intervals["2d"])
So(err, ShouldBeNil)
Convey("should be rounded up to a minute as is the minimum interval for Azure Monitor", func() {
So(durationM, ShouldEqual, "PT10M")
So(durationD, ShouldEqual, "P2D")
})
})
})
})
}

View File

@ -0,0 +1,77 @@
package azuremonitor
import (
"net/url"
"time"
)
// AzureMonitorQuery is the query for all the services as they have similar queries
// with a URL, a querystring and an alias field
type AzureMonitorQuery struct {
URL string
UrlComponents map[string]string
Target string
Params url.Values
RefID string
Alias string
}
// AzureMonitorResponse is the json response from the Azure Monitor API
type AzureMonitorResponse struct {
Cost int `json:"cost"`
Timespan string `json:"timespan"`
Interval string `json:"interval"`
Value []struct {
ID string `json:"id"`
Type string `json:"type"`
Name struct {
Value string `json:"value"`
LocalizedValue string `json:"localizedValue"`
} `json:"name"`
Unit string `json:"unit"`
Timeseries []struct {
Metadatavalues []struct {
Name struct {
Value string `json:"value"`
LocalizedValue string `json:"localizedValue"`
} `json:"name"`
Value string `json:"value"`
} `json:"metadatavalues"`
Data []struct {
TimeStamp time.Time `json:"timeStamp"`
Average float64 `json:"average,omitempty"`
Total float64 `json:"total,omitempty"`
Count float64 `json:"count,omitempty"`
Maximum float64 `json:"maximum,omitempty"`
Minimum float64 `json:"minimum,omitempty"`
} `json:"data"`
} `json:"timeseries"`
} `json:"value"`
Namespace string `json:"namespace"`
Resourceregion string `json:"resourceregion"`
}
// ApplicationInsightsResponse is the json response from the Application Insights API
type ApplicationInsightsResponse struct {
Tables []struct {
TableName string `json:"TableName"`
Columns []struct {
ColumnName string `json:"ColumnName"`
DataType string `json:"DataType"`
ColumnType string `json:"ColumnType"`
} `json:"Columns"`
Rows [][]interface{} `json:"Rows"`
} `json:"Tables"`
}
// AzureLogAnalyticsResponse is the json response object from the Azure Log Analytics API.
type AzureLogAnalyticsResponse struct {
Tables []struct {
Name string `json:"name"`
Columns []struct {
Name string `json:"name"`
Type string `json:"type"`
} `json:"columns"`
Rows [][]interface{} `json:"rows"`
} `json:"tables"`
}

View File

@ -0,0 +1,28 @@
package azuremonitor
import (
"fmt"
"strings"
)
// urlBuilder builds the URL for calling the Azure Monitor API
type urlBuilder struct {
ResourceGroup string
MetricDefinition string
ResourceName string
}
// Build checks the metric definition property to see which form of the URL
// should be returned
func (ub *urlBuilder) Build() string {
if strings.Count(ub.MetricDefinition, "/") > 1 {
rn := strings.Split(ub.ResourceName, "/")
lastIndex := strings.LastIndex(ub.MetricDefinition, "/")
service := ub.MetricDefinition[lastIndex+1:]
md := ub.MetricDefinition[0:lastIndex]
return fmt.Sprintf("resourceGroups/%s/providers/%s/%s/%s/%s/providers/microsoft.insights/metrics", ub.ResourceGroup, md, rn[0], service, rn[1])
}
return fmt.Sprintf("resourceGroups/%s/providers/%s/%s/providers/microsoft.insights/metrics", ub.ResourceGroup, ub.MetricDefinition, ub.ResourceName)
}
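// Note on the nested form above (example values taken from the tests below): for a metric
// definition such as Microsoft.Storage/storageAccounts/blobServices the resource name is
// expected to arrive as "account/instance" (e.g. "rn1/default"), which is why it is split
// into rn[0] and rn[1]; a single-part resource name would make the rn[1] access panic, so
// the frontend is assumed to always send the two-part form for these definitions.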

View File

@ -0,0 +1,45 @@
package azuremonitor
import (
"testing"
. "github.com/smartystreets/goconvey/convey"
)
func TestURLBuilder(t *testing.T) {
Convey("AzureMonitor URL Builder", t, func() {
Convey("when metric definition is in the short form", func() {
ub := &urlBuilder{
ResourceGroup: "rg",
MetricDefinition: "Microsoft.Compute/virtualMachines",
ResourceName: "rn",
}
url := ub.Build()
So(url, ShouldEqual, "resourceGroups/rg/providers/Microsoft.Compute/virtualMachines/rn/providers/microsoft.insights/metrics")
})
Convey("when metric definition is Microsoft.Storage/storageAccounts/blobServices", func() {
ub := &urlBuilder{
ResourceGroup: "rg",
MetricDefinition: "Microsoft.Storage/storageAccounts/blobServices",
ResourceName: "rn1/default",
}
url := ub.Build()
So(url, ShouldEqual, "resourceGroups/rg/providers/Microsoft.Storage/storageAccounts/rn1/blobServices/default/providers/microsoft.insights/metrics")
})
Convey("when metric definition is Microsoft.Storage/storageAccounts/fileServices", func() {
ub := &urlBuilder{
ResourceGroup: "rg",
MetricDefinition: "Microsoft.Storage/storageAccounts/fileServices",
ResourceName: "rn1/default",
}
url := ub.Build()
So(url, ShouldEqual, "resourceGroups/rg/providers/Microsoft.Storage/storageAccounts/rn1/fileServices/default/providers/microsoft.insights/metrics")
})
})
}

View File

@ -55,6 +55,7 @@ func init() {
"AWS/DynamoDB": {"ConditionalCheckFailedRequests", "ConsumedReadCapacityUnits", "ConsumedWriteCapacityUnits", "OnlineIndexConsumedWriteCapacity", "OnlineIndexPercentageProgress", "OnlineIndexThrottleEvents", "ProvisionedReadCapacityUnits", "ProvisionedWriteCapacityUnits", "ReadThrottleEvents", "ReturnedBytes", "ReturnedItemCount", "ReturnedRecordsCount", "SuccessfulRequestLatency", "SystemErrors", "TimeToLiveDeletedItemCount", "ThrottledRequests", "UserErrors", "WriteThrottleEvents"},
"AWS/EBS": {"VolumeReadBytes", "VolumeWriteBytes", "VolumeReadOps", "VolumeWriteOps", "VolumeTotalReadTime", "VolumeTotalWriteTime", "VolumeIdleTime", "VolumeQueueLength", "VolumeThroughputPercentage", "VolumeConsumedReadWriteOps", "BurstBalance"},
"AWS/EC2": {"CPUCreditUsage", "CPUCreditBalance", "CPUUtilization", "DiskReadOps", "DiskWriteOps", "DiskReadBytes", "DiskWriteBytes", "NetworkIn", "NetworkOut", "NetworkPacketsIn", "NetworkPacketsOut", "StatusCheckFailed", "StatusCheckFailed_Instance", "StatusCheckFailed_System"},
"AWS/EC2/API": {"ClientErrors", "RequestLimitExceeded", "ServerErrors", "SuccessfulCalls"},
"AWS/EC2Spot": {"AvailableInstancePoolsCount", "BidsSubmittedForCapacity", "EligibleInstancePoolCount", "FulfilledCapacity", "MaxPercentCapacityAllocation", "PendingCapacity", "PercentCapacityAllocation", "TargetCapacity", "TerminatingCapacity"},
"AWS/ECS": {"CPUReservation", "MemoryReservation", "CPUUtilization", "MemoryUtilization"},
"AWS/EFS": {"BurstCreditBalance", "ClientConnections", "DataReadIOBytes", "DataWriteIOBytes", "MetadataIOBytes", "TotalIOBytes", "PermittedThroughput", "PercentIOLimit"},
@ -133,6 +134,7 @@ func init() {
"AWS/DynamoDB": {"TableName", "GlobalSecondaryIndexName", "Operation", "StreamLabel"},
"AWS/EBS": {"VolumeId"},
"AWS/EC2": {"AutoScalingGroupName", "ImageId", "InstanceId", "InstanceType"},
"AWS/EC2/API": {},
"AWS/EC2Spot": {"AvailabilityZone", "FleetRequestId", "InstanceType"},
"AWS/ECS": {"ClusterName", "ServiceName"},
"AWS/EFS": {"FileSystemId"},

View File

@ -59,11 +59,11 @@ func (ic *intervalCalculator) Calculate(timerange *TimeRange, minInterval time.D
interval := time.Duration((to - from) / defaultRes)
if interval < minInterval {
return Interval{Text: formatDuration(minInterval), Value: minInterval}
return Interval{Text: FormatDuration(minInterval), Value: minInterval}
}
rounded := roundInterval(interval)
return Interval{Text: formatDuration(rounded), Value: rounded}
return Interval{Text: FormatDuration(rounded), Value: rounded}
}
func GetIntervalFrom(dsInfo *models.DataSource, queryModel *simplejson.Json, defaultInterval time.Duration) (time.Duration, error) {
@ -89,7 +89,8 @@ func GetIntervalFrom(dsInfo *models.DataSource, queryModel *simplejson.Json, def
return parsedInterval, nil
}
func formatDuration(inter time.Duration) string {
// FormatDuration converts a duration into the kbn format, e.g. 1m, 2h or 3d
func FormatDuration(inter time.Duration) string {
if inter >= year {
return fmt.Sprintf("%dy", inter/year)
}

View File

@ -51,11 +51,11 @@ func TestInterval(t *testing.T) {
})
Convey("Format value", func() {
So(formatDuration(time.Second*61), ShouldEqual, "1m")
So(formatDuration(time.Millisecond*30), ShouldEqual, "30ms")
So(formatDuration(time.Hour*23), ShouldEqual, "23h")
So(formatDuration(time.Hour*24), ShouldEqual, "1d")
So(formatDuration(time.Hour*24*367), ShouldEqual, "1y")
So(FormatDuration(time.Second*61), ShouldEqual, "1m")
So(FormatDuration(time.Millisecond*30), ShouldEqual, "30ms")
So(FormatDuration(time.Hour*23), ShouldEqual, "23h")
So(FormatDuration(time.Hour*24), ShouldEqual, "1d")
So(FormatDuration(time.Hour*24*367), ShouldEqual, "1y")
})
})
}

View File

@ -32,6 +32,18 @@ func newMysqlQueryEndpoint(datasource *models.DataSource) (tsdb.TsdbQueryEndpoin
datasource.Url,
datasource.Database,
)
tlsConfig, err := datasource.GetTLSConfig()
if err != nil {
return nil, err
}
if tlsConfig.RootCAs != nil || len(tlsConfig.Certificates) > 0 {
tlsConfigString := fmt.Sprintf("ds%d", datasource.Id)
mysql.RegisterTLSConfig(tlsConfigString, tlsConfig)
cnnstr += "&tls=" + tlsConfigString
}
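// Sketch of the effect (host, credentials and datasource id are illustrative): for a
// datasource with id 5 that has a CA or client certificate configured, the TLS config is
// registered under the name "ds5" and "&tls=ds5" is appended to the DSN, giving a
// connection string along the lines of "user:pass@tcp(localhost:3306)/grafana?...&tls=ds5".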
logger.Debug("getEngine", "connection", cnnstr)
config := tsdb.SqlQueryEndpointConfiguration{

View File

@ -1,11 +1,12 @@
import React, { FC } from 'react';
import Transition from 'react-transition-group/Transition';
import Transition, { ExitHandler } from 'react-transition-group/Transition';
interface Props {
duration: number;
children: JSX.Element;
in: boolean;
unmountOnExit?: boolean;
onExited?: ExitHandler;
}
export const FadeIn: FC<Props> = props => {
@ -22,7 +23,12 @@ export const FadeIn: FC<Props> = props => {
};
return (
<Transition in={props.in} timeout={props.duration} unmountOnExit={props.unmountOnExit || false}>
<Transition
in={props.in}
timeout={props.duration}
unmountOnExit={props.unmountOnExit || false}
onExited={props.onExited}
>
{state => (
<div
style={{

View File

@ -20,7 +20,7 @@ class EmptyListCTA extends Component<Props, any> {
return (
<div className="empty-list-cta">
<div className="empty-list-cta__title">{title}</div>
<a onClick={onClick} href={buttonLink} className="empty-list-cta__button btn btn-xlarge btn-success">
<a onClick={onClick} href={buttonLink} className="empty-list-cta__button btn btn-xlarge btn-primary">
<i className={buttonIcon} />
{buttonTitle}
</a>

View File

@ -10,7 +10,7 @@ exports[`EmptyListCTA renders correctly 1`] = `
Title
</div>
<a
className="empty-list-cta__button btn btn-xlarge btn-success"
className="empty-list-cta__button btn btn-xlarge btn-primary"
href="http://url/to/destination"
onClick={[MockFunction]}
>

View File

@ -35,7 +35,7 @@ export default class OrgActionBar extends PureComponent<Props> {
<LayoutSelector mode={layoutMode} onLayoutModeChanged={(mode: LayoutMode) => onSetLayoutMode(mode)} />
</div>
<div className="page-action-bar__spacer" />
<a className="btn btn-success" {...linkProps}>
<a className="btn btn-primary" {...linkProps}>
{linkButton.title}
</a>
</div>

View File

@ -29,7 +29,7 @@ exports[`Render should render component 1`] = `
className="page-action-bar__spacer"
/>
<a
className="btn btn-success"
className="btn btn-primary"
href="some/url"
target="_blank"
>

View File

@ -130,7 +130,7 @@ class AddPermissions extends Component<Props, NewDashboardAclItem> {
</div>
<div className="gf-form">
<button data-save-permission className="btn btn-success" type="submit" disabled={!isValid}>
<button data-save-permission className="btn btn-primary" type="submit" disabled={!isValid}>
Save
</button>
</div>

View File

@ -126,7 +126,7 @@ export class SharedPreferences extends PureComponent<Props, State> {
/>
</div>
<div className="gf-form-button-row">
<button type="submit" className="btn btn-success">
<button type="submit" className="btn btn-primary">
Save
</button>
</div>

View File

@ -5,16 +5,13 @@
<i class="gf-form-input-icon fa fa-search"></i>
</label>
<div class="page-action-bar__spacer"></div>
<a class="btn btn-success" ng-href="{{ctrl.createDashboardUrl()}}" ng-if="ctrl.hasEditPermissionInFolders || ctrl.canSave">
<i class="fa fa-plus"></i>
Dashboard
<a class="btn btn-primary" ng-href="{{ctrl.createDashboardUrl()}}" ng-if="ctrl.hasEditPermissionInFolders || ctrl.canSave">
New Dashboard
</a>
<a class="btn btn-success" href="dashboards/folder/new" ng-if="!ctrl.folderId && ctrl.isEditor">
<i class="fa fa-plus"></i>
Folder
<a class="btn btn-inverse" href="dashboards/folder/new" ng-if="!ctrl.folderId && ctrl.isEditor">
New Folder
</a>
<a class="btn btn-success" href="{{ctrl.importDashboardUrl()}}" ng-if="ctrl.hasEditPermissionInFolders || ctrl.canSave">
<i class="fa fa-plus"></i>
<a class="btn btn-inverse" href="{{ctrl.importDashboardUrl()}}" ng-if="ctrl.hasEditPermissionInFolders || ctrl.canSave">
Import
</a>
</div>

View File

@ -8,6 +8,16 @@ jest.mock('../../app_events', () => ({
emit: jest.fn(),
}));
jest.mock('app/store/store', () => ({
store: {
getState: jest.fn().mockReturnValue({
location: {
lastUpdated: 0,
},
}),
},
}));
jest.mock('app/core/services/context_srv', () => ({
contextSrv: {
sidemenu: true,

View File

@ -3,9 +3,16 @@ import appEvents from '../../app_events';
import { contextSrv } from 'app/core/services/context_srv';
import TopSection from './TopSection';
import BottomSection from './BottomSection';
import { store } from 'app/store/store';
export class SideMenu extends PureComponent {
toggleSideMenu = () => {
// ignore if we just made a location change, stops hiding sidemenu on double clicks of back button
const timeSinceLocationChanged = new Date().getTime() - store.getState().location.lastUpdated;
if (timeSinceLocationChanged < 1000) {
return;
}
contextSrv.toggleSideMenu();
appEvents.emit('toggle-sidemenu');
};

View File

@ -1,4 +1,3 @@
import './directives/dash_class';
import './directives/dropdown_typeahead';
import './directives/autofill_event_fix';
import './directives/metric_segment';

View File

@ -1,39 +0,0 @@
import $ from 'jquery';
import _ from 'lodash';
import coreModule from '../core_module';
/** @ngInject */
function dashClass($timeout) {
return {
link: ($scope, elem) => {
const body = $('body');
$scope.ctrl.dashboard.events.on('view-mode-changed', panel => {
console.log('view-mode-changed', panel.fullscreen);
if (panel.fullscreen) {
body.addClass('panel-in-fullscreen');
} else {
$timeout(() => {
body.removeClass('panel-in-fullscreen');
});
}
});
body.toggleClass('panel-in-fullscreen', $scope.ctrl.dashboard.meta.fullscreen === true);
$scope.$watch('ctrl.dashboardViewState.state.editview', newValue => {
if (newValue) {
elem.toggleClass('dashboard-page--settings-opening', _.isString(newValue));
setTimeout(() => {
elem.toggleClass('dashboard-page--settings-open', _.isString(newValue));
}, 10);
} else {
elem.removeClass('dashboard-page--settings-opening');
elem.removeClass('dashboard-page--settings-open');
}
});
},
};
}
coreModule.directive('dashClass', dashClass);

View File

@ -340,6 +340,11 @@ export function makeSeriesForLogs(rows: LogRowModel[], intervalMs: number): Time
return a[1] - b[1];
});
return { datapoints: series.datapoints, target: series.alias, color: series.color };
return {
datapoints: series.datapoints,
target: series.alias,
alias: series.alias,
color: series.color,
};
});
}

View File

@ -9,6 +9,7 @@ export const initialState: LocationState = {
query: {},
routeParams: {},
replace: false,
lastUpdated: 0,
};
export const locationReducer = (state = initialState, action: Action): LocationState => {
@ -28,6 +29,7 @@ export const locationReducer = (state = initialState, action: Action): LocationS
query: { ...query },
routeParams: routeParams || state.routeParams,
replace: replace === true,
lastUpdated: new Date().getTime(),
};
}
}

View File

@ -139,6 +139,10 @@ export class KeybindingSrv {
);
}
unbind(keyArg: string, keyType?: string) {
Mousetrap.unbind(keyArg, keyType);
}
showDashEditView() {
const search = _.extend(this.$location.search(), { editview: 'settings' });
this.$location.search(search);
@ -291,3 +295,17 @@ export class KeybindingSrv {
}
coreModule.service('keybindingSrv', KeybindingSrv);
/**
* Code below exports the service to react components
*/
let singletonInstance: KeybindingSrv;
export function setKeybindingSrv(instance: KeybindingSrv) {
singletonInstance = instance;
}
export function getKeybindingSrv(): KeybindingSrv {
return singletonInstance;
}

View File

@ -8,6 +8,7 @@ import {
} from './explore';
import { ExploreUrlState } from 'app/types/explore';
import store from 'app/core/store';
import { LogsDedupStrategy } from 'app/core/logs_model';
const DEFAULT_EXPLORE_STATE: ExploreUrlState = {
datasource: null,
@ -17,7 +18,8 @@ const DEFAULT_EXPLORE_STATE: ExploreUrlState = {
showingGraph: true,
showingTable: true,
showingLogs: true,
}
dedupStrategy: LogsDedupStrategy.none,
},
};
describe('state functions', () => {
@ -78,7 +80,7 @@ describe('state functions', () => {
expect(serializeStateToUrlParam(state)).toBe(
'{"datasource":"foo","queries":[{"expr":"metric{test=\\"a/b\\"}"},' +
'{"expr":"super{foo=\\"x/z\\"}"}],"range":{"from":"now-5h","to":"now"},' +
'"ui":{"showingGraph":true,"showingTable":true,"showingLogs":true}}'
'"ui":{"showingGraph":true,"showingTable":true,"showingLogs":true,"dedupStrategy":"none"}}'
);
});
@ -100,7 +102,7 @@ describe('state functions', () => {
},
};
expect(serializeStateToUrlParam(state, true)).toBe(
'["now-5h","now","foo",{"expr":"metric{test=\\"a/b\\"}"},{"expr":"super{foo=\\"x/z\\"}"},{"ui":[true,true,true]}]'
'["now-5h","now","foo",{"expr":"metric{test=\\"a/b\\"}"},{"expr":"super{foo=\\"x/z\\"}"},{"ui":[true,true,true,"none"]}]'
);
});
});

View File

@ -21,6 +21,7 @@ import {
QueryIntervals,
QueryOptions,
} from 'app/types/explore';
import { LogsDedupStrategy } from 'app/core/logs_model';
export const DEFAULT_RANGE = {
from: 'now-6h',
@ -31,6 +32,7 @@ export const DEFAULT_UI_STATE = {
showingTable: true,
showingGraph: true,
showingLogs: true,
dedupStrategy: LogsDedupStrategy.none,
};
const MAX_HISTORY_ITEMS = 100;
@ -183,6 +185,7 @@ export function parseUrlState(initial: string | undefined): ExploreUrlState {
showingGraph: segment.ui[0],
showingLogs: segment.ui[1],
showingTable: segment.ui[2],
dedupStrategy: segment.ui[3],
};
}
});
@ -204,7 +207,14 @@ export function serializeStateToUrlParam(urlState: ExploreUrlState, compact?: bo
urlState.range.to,
urlState.datasource,
...urlState.queries,
{ ui: [!!urlState.ui.showingGraph, !!urlState.ui.showingLogs, !!urlState.ui.showingTable] },
{
ui: [
!!urlState.ui.showingGraph,
!!urlState.ui.showingLogs,
!!urlState.ui.showingTable,
urlState.ui.dedupStrategy,
],
},
]);
}
return JSON.stringify(urlState);

View File

@ -0,0 +1,5 @@
import { memoize } from 'lodash';
import { createSelectorCreator } from 'reselect';
const hashFn = (...args) => args.reduce((acc, val) => acc + '-' + JSON.stringify(val), '');
export const createLodashMemoizedSelector = createSelectorCreator(memoize, hashFn);

View File

@ -10,7 +10,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="update()" ng-show="!createMode">Update</button>
<button type="submit" class="btn btn-primary" ng-click="update()" ng-show="!createMode">Update</button>
</div>
</form>

View File

@ -21,7 +21,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="update()" ng-show="!createMode">Update</button>
<button type="submit" class="btn btn-primary" ng-click="update()" ng-show="!createMode">Update</button>
</div>
</form>
@ -34,7 +34,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="setPassword()">Update</button>
<button type="submit" class="btn btn-primary" ng-click="setPassword()">Update</button>
</div>
</form>
@ -46,7 +46,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="updatePermissions()">Update</button>
<button type="submit" class="btn btn-primary" ng-click="updatePermissions()">Update</button>
</div>
</form>
@ -65,7 +65,7 @@
</span>
</div>
<div class="gf-form">
<button class="btn btn-success gf-form-btn" ng-click="addOrgUser()">Add</button>
<button class="btn btn-primary gf-form-btn" ng-click="addOrgUser()">Add</button>
</div>
</div>
</form>

View File

@ -24,7 +24,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="create()">Create</button>
<button type="submit" class="btn btn-primary" ng-click="create()">Create</button>
</div>
</form>
</div>

View File

@ -3,7 +3,7 @@
<div class="page-container page-body">
<div class="page-action-bar">
<div class="page-action-bar__spacer"></div>
<a class="page-header__cta btn btn-success" href="org/new">
<a class="page-header__cta btn btn-primary" href="org/new">
<i class="fa fa-plus"></i>
New Org
</a>

View File

@ -7,7 +7,7 @@
<i class="gf-form-input-icon fa fa-search"></i>
</label>
<div class="page-action-bar__spacer"></div>
<a class="btn btn-success" href="admin/users/create">
<a class="btn btn-primary" href="admin/users/create">
<i class="fa fa-plus"></i>
Add new user
</a>

View File

@ -68,7 +68,7 @@
</div>
<div class="gf-form-group gf-form-button-row">
<button type="submit" ng-click="ctrl.save()" class="btn btn-success width-7">Save</button>
<button type="submit" ng-click="ctrl.save()" class="btn btn-primary width-7">Save</button>
<button type="submit" ng-click="ctrl.testNotification()" class="btn btn-secondary width-7">Send Test</button>
<a href="alerting/notifications" class="btn btn-inverse">Back</a>
</div>

View File

@ -7,8 +7,7 @@
<div class="page-action-bar__spacer">
</div>
<a href="alerting/notification/new" class="btn btn-success">
<i class="fa fa-plus"></i>
<a href="alerting/notification/new" class="btn btn-primary">
New Channel
</a>
</div>

View File

@ -12,3 +12,4 @@ import './manage-dashboards';
import './teams/CreateTeamCtrl';
import './profile/all';
import './datasources/settings/HttpSettingsCtrl';
import './datasources/settings/TlsAuthSettingsCtrl';

View File

@ -9,7 +9,7 @@
<div ng-if="ctrl.mode === 'list'">
<div class="page-action-bar" ng-if="ctrl.annotations.length > 1">
<div class="page-action-bar__spacer"></div>
<a type="button" class="btn btn-success" ng-click="ctrl.setupNew();"><i class="fa fa-plus" ></i> New</a>
<a type="button" class="btn btn-primary" ng-click="ctrl.setupNew();"><i class="fa fa-plus" ></i> New</a>
</div>
<table class="filter-table filter-table--hover">
@ -48,7 +48,7 @@
<div ng-if="ctrl.annotations.length === 1" class="p-t-2">
<div class="empty-list-cta">
<div class="empty-list-cta__title">There are no custom annotation queries added yet</div>
<a ng-click="ctrl.setupNew()" class="empty-list-cta__button btn btn-xlarge btn-success">
<a ng-click="ctrl.setupNew()" class="empty-list-cta__button btn btn-xlarge btn-primary">
<i class="gicon gicon-add-annotation"></i>
Add Annotation Query
</a>
@ -105,8 +105,8 @@
<div class="gf-form">
<div class="gf-form-button-row p-y-0">
<button ng-show="ctrl.mode === 'new'" type="button" class="btn gf-form-button btn-success" ng-click="ctrl.add()">Add</button>
<button ng-show="ctrl.mode === 'edit'" type="button" class="btn btn-success pull-left" ng-click="ctrl.update()">Update</button>
<button ng-show="ctrl.mode === 'new'" type="button" class="btn gf-form-button btn-primary" ng-click="ctrl.add()">Add</button>
<button ng-show="ctrl.mode === 'edit'" type="button" class="btn btn-primary pull-left" ng-click="ctrl.update()">Update</button>
</div>
</div>
</div>

View File

@ -26,7 +26,7 @@
</div>
<div class="gf-form-button-row">
<button type="submit" class="btn btn-success" ng-click="ctrl.save()">Save</button>
<button type="submit" class="btn btn-primary" ng-click="ctrl.save()">Save</button>
<button ng-if="ctrl.event.id" type="submit" class="btn btn-danger" ng-click="ctrl.delete()">Delete</button>
<a class="btn-text" ng-click="ctrl.close();">Cancel</a>
</div>

View File

@ -107,7 +107,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
renderEmptyList() {
const { isAdding } = this.state;
return (
<div className="page-container page-body">
<>
{!isAdding && (
<EmptyListCTA
model={{
@ -124,7 +124,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
/>
)}
{this.renderAddApiKeyForm()}
</div>
</>
);
}
@ -169,7 +169,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
</span>
</div>
<div className="gf-form">
<button className="btn gf-form-btn btn-success">Add</button>
<button className="btn gf-form-btn btn-primary">Add</button>
</div>
</div>
</form>
@ -183,7 +183,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
const { apiKeys, searchQuery } = this.props;
return (
<div className="page-container page-body">
<>
<div className="page-action-bar">
<div className="gf-form gf-form--grow">
<label className="gf-form--has-input-icon gf-form--grow">
@ -199,8 +199,8 @@ export class ApiKeysPage extends PureComponent<Props, any> {
</div>
<div className="page-action-bar__spacer" />
<button className="btn btn-success pull-right" onClick={this.onToggleAdding} disabled={isAdding}>
<i className="fa fa-plus" /> Add API Key
<button className="btn btn-primary pull-right" onClick={this.onToggleAdding} disabled={isAdding}>
Add API Key
</button>
</div>
@ -231,7 +231,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
</tbody>
) : null}
</table>
</div>
</>
);
}
@ -241,13 +241,7 @@ export class ApiKeysPage extends PureComponent<Props, any> {
return (
<Page navModel={navModel}>
<Page.Contents isLoading={!hasFetched}>
{hasFetched && (
apiKeysCount > 0 ? (
this.renderApiKeyList()
) : (
this.renderEmptyList()
)
)}
{hasFetched && (apiKeysCount > 0 ? this.renderApiKeyList() : this.renderEmptyList())}
</Page.Contents>
</Page>
);


@ -35,118 +35,114 @@ exports[`Render should render CTA if there are no API keys 1`] = `
<PageContents
isLoading={false}
>
<div
className="page-container page-body"
>
<EmptyListCTA
model={
Object {
"buttonIcon": "fa fa-plus",
"buttonLink": "#",
"buttonTitle": " New API Key",
"onClick": [Function],
"proTip": "Remember you can provide view-only API access to other applications.",
"proTipLink": "",
"proTipLinkTitle": "",
"proTipTarget": "_blank",
"title": "You haven't added any API Keys yet.",
}
<EmptyListCTA
model={
Object {
"buttonIcon": "fa fa-plus",
"buttonLink": "#",
"buttonTitle": " New API Key",
"onClick": [Function],
"proTip": "Remember you can provide view-only API access to other applications.",
"proTipLink": "",
"proTipLinkTitle": "",
"proTipTarget": "_blank",
"title": "You haven't added any API Keys yet.",
}
/>
<Component
in={false}
}
/>
<Component
in={false}
>
<div
className="cta-form"
>
<div
className="cta-form"
<button
className="cta-form__close btn btn-transparent"
onClick={[Function]}
>
<button
className="cta-form__close btn btn-transparent"
onClick={[Function]}
>
<i
className="fa fa-close"
/>
</button>
<h5>
Add API Key
</h5>
<form
className="gf-form-group"
onSubmit={[Function]}
<i
className="fa fa-close"
/>
</button>
<h5>
Add API Key
</h5>
<form
className="gf-form-group"
onSubmit={[Function]}
>
<div
className="gf-form-inline"
>
<div
className="gf-form-inline"
className="gf-form max-width-21"
>
<div
className="gf-form max-width-21"
<span
className="gf-form-label"
>
<span
className="gf-form-label"
>
Key name
</span>
<input
className="gf-form-input"
Key name
</span>
<input
className="gf-form-input"
onChange={[Function]}
placeholder="Name"
type="text"
value=""
/>
</div>
<div
className="gf-form"
>
<span
className="gf-form-label"
>
Role
</span>
<span
className="gf-form-select-wrapper"
>
<select
className="gf-form-input gf-size-auto"
onChange={[Function]}
placeholder="Name"
type="text"
value=""
/>
</div>
<div
className="gf-form"
>
<span
className="gf-form-label"
value="Viewer"
>
Role
</span>
<span
className="gf-form-select-wrapper"
>
<select
className="gf-form-input gf-size-auto"
onChange={[Function]}
<option
key="Viewer"
label="Viewer"
value="Viewer"
>
<option
key="Viewer"
label="Viewer"
value="Viewer"
>
Viewer
</option>
<option
key="Editor"
label="Editor"
value="Editor"
>
Editor
</option>
<option
key="Admin"
label="Admin"
value="Admin"
>
Admin
</option>
</select>
</span>
</div>
<div
className="gf-form"
>
<button
className="btn gf-form-btn btn-success"
>
Add
</button>
</div>
Viewer
</option>
<option
key="Editor"
label="Editor"
value="Editor"
>
Editor
</option>
<option
key="Admin"
label="Admin"
value="Admin"
>
Admin
</option>
</select>
</span>
</div>
</form>
</div>
</Component>
</div>
<div
className="gf-form"
>
<button
className="btn gf-form-btn btn-primary"
>
Add
</button>
</div>
</div>
</form>
</div>
</Component>
</PageContents>
</Page>
`;


@ -12,7 +12,7 @@
</gf-form-switch>
<div class="gf-form-button-row">
<button type="button" class="btn gf-form-btn width-10 btn-success" ng-click="ctrl.saveDashboardAsFile()">
<button type="button" class="btn gf-form-btn width-10 btn-primary" ng-click="ctrl.saveDashboardAsFile()">
<i class="fa fa-save"></i> Save to file
</button>
<button type="button" class="btn gf-form-btn width-10 btn-secondary" ng-click="ctrl.viewJson()">


@ -10,7 +10,7 @@
<div class="empty-list-cta__title">
There are no dashboard links added yet
</div>
<a ng-click="ctrl.setupNew()" class="empty-list-cta__button btn btn-xlarge btn-success">
<a ng-click="ctrl.setupNew()" class="empty-list-cta__button btn btn-xlarge btn-primary">
<i class="gicon gicon-add-link"></i>
Add Dashboard Link
</a>
@ -26,7 +26,7 @@
<div ng-if="ctrl.dashboard.links.length > 0">
<div class="page-action-bar">
<div class="page-action-bar__spacer"></div>
<a type="button" class="btn btn-success" ng-click="ctrl.setupNew()">
<a type="button" class="btn btn-primary" ng-click="ctrl.setupNew()">
<i class="fa fa-plus"></i> New</a>
</div>
<table class="filter-table filter-table--hover">
@ -126,10 +126,10 @@
</div>
</div>
</div>
<button class="btn btn-success" ng-if="ctrl.mode == 'new'" ng-click="ctrl.addLink()">
<button class="btn btn-primary" ng-if="ctrl.mode == 'new'" ng-click="ctrl.addLink()">
Add
</button>
<button class="btn btn-success" ng-if="ctrl.mode == 'edit'" ng-click="ctrl.saveLink()">
<button class="btn btn-primary" ng-if="ctrl.mode == 'edit'" ng-click="ctrl.saveLink()">
Update
</button>
</div>


@ -9,12 +9,13 @@ import { PlaylistSrv } from 'app/features/playlist/playlist_srv';
// Components
import { DashNavButton } from './DashNavButton';
import { Tooltip } from '@grafana/ui';
// State
import { updateLocation } from 'app/core/actions';
// Types
import { DashboardModel } from '../../state/DashboardModel';
import { DashboardModel } from '../../state';
export interface Props {
dashboard: DashboardModel;
@ -33,7 +34,6 @@ export class DashNav extends PureComponent<Props> {
constructor(props: Props) {
super(props);
this.playlistSrv = this.props.$injector.get('playlistSrv');
}
@ -123,26 +123,54 @@ export class DashNav extends PureComponent<Props> {
});
};
render() {
const { dashboard, isFullscreen, editview, onAddPanel } = this.props;
const { canStar, canSave, canShare, folderTitle, showSettings, isStarred } = dashboard.meta;
const { snapshot } = dashboard;
renderDashboardTitleSearchButton() {
const { dashboard } = this.props;
const folderTitle = dashboard.meta.folderTitle;
const haveFolder = dashboard.meta.folderId > 0;
const snapshotUrl = snapshot && snapshot.originalUrl;
return (
<div className="navbar">
<>
<div>
<a className="navbar-page-btn" onClick={this.onOpenSearch}>
<i className="gicon gicon-dashboard" />
{!this.isInFullscreenOrSettings && <i className="gicon gicon-dashboard" />}
{haveFolder && <span className="navbar-page-btn--folder">{folderTitle} / </span>}
{dashboard.title}
<i className="fa fa-caret-down" />
</a>
</div>
<div className="navbar__spacer" />
</>
);
}
get isInFullscreenOrSettings() {
return this.props.editview || this.props.isFullscreen;
}
renderBackButton() {
return (
<div className="navbar-edit">
<Tooltip content="Go back (Esc)">
<button className="navbar-edit__back-btn" onClick={this.onClose}>
<i className="fa fa-arrow-left" />
</button>
</Tooltip>
</div>
);
}
render() {
const { dashboard, onAddPanel } = this.props;
const { canStar, canSave, canShare, showSettings, isStarred } = dashboard.meta;
const { snapshot } = dashboard;
const snapshotUrl = snapshot && snapshot.originalUrl;
return (
<div className="navbar">
{this.isInFullscreenOrSettings && this.renderBackButton()}
{this.renderDashboardTitleSearchButton()}
{this.playlistSrv.isPlaying && (
<div className="navbar-buttons navbar-buttons--playlist">
@ -228,17 +256,6 @@ export class DashNav extends PureComponent<Props> {
</div>
<div className="gf-timepicker-nav" ref={element => (this.timePickerEl = element)} />
{(isFullscreen || editview) && (
<div className="navbar-buttons navbar-buttons--close">
<DashNavButton
tooltip="Back to dashboard"
classSuffix="primary"
icon="fa fa-reply"
onClick={this.onClose}
/>
</div>
)}
</div>
);
}


@ -76,8 +76,8 @@ export class DashboardPermissions extends PureComponent<Props, State> {
</div>
</Tooltip>
<div className="page-action-bar__spacer" />
<button className="btn btn-success pull-right" onClick={this.onOpenAddPermissions} disabled={isAdding}>
<i className="fa fa-plus" /> Add Permission
<button className="btn btn-primary pull-right" onClick={this.onOpenAddPermissions} disabled={isAdding}>
Add Permission
</button>
</div>
</div>


@ -38,7 +38,7 @@ export class SettingsCtrl {
});
});
this.canSaveAs = this.dashboard.meta.canEdit && contextSrv.hasEditPermissionInFolders;
this.canSaveAs = contextSrv.hasEditPermissionInFolders;
this.canSave = this.dashboard.meta.canSave;
this.canDelete = this.dashboard.meta.canSave;


@ -10,7 +10,7 @@
</a>
<div class="dashboard-settings__aside-actions">
<button class="btn btn-success" ng-click="ctrl.saveDashboard()" ng-show="ctrl.canSave">
<button class="btn btn-primary" ng-click="ctrl.saveDashboard()" ng-show="ctrl.canSave">
<i class="fa fa-save"></i> Save
</button>
<button class="btn btn-inverse" ng-click="ctrl.openSaveAsModal()" ng-show="ctrl.canSaveAs">
@ -100,7 +100,7 @@
</div>
<div class="gf-form-button-row">
<button class="btn btn-success" ng-click="ctrl.saveDashboardJson()" ng-show="ctrl.canSave">
<button class="btn btn-primary" ng-click="ctrl.saveDashboardJson()" ng-show="ctrl.canSave">
<i class="fa fa-save"></i> Save Changes
</button>
</div>
@ -128,7 +128,7 @@
<div class="dashboard-settings__content" ng-if="ctrl.viewId === 'make_editable'">
<h3 class="dashboard-settings__header">Make Editable</h3>
<button class="btn btn-success" ng-click="ctrl.makeEditable()">
<button class="btn btn-primary" ng-click="ctrl.makeEditable()">
Make Editable
</button>
</div>

Some files were not shown because too many files have changed in this diff.