* BarGauge: New value options
* Fix typings for cell options, add new value mode option for bar gauge cells
* Add BarGauge panel option, tests, and update test dashboard
* Updated
* Added default
* Goodbye trusty console.log
* Update
* Merge changes from main
* Update docs
* Add valuemode doc changes
* Update gdev dashboard
* Update valueMode symbol name to valueDisplayMode
* Use enums as opposed to literals, don't calculate values when hidden
* Remove double import
* Fix tests
* One more test fix
* Remove erroneous targets field, fix type of maxDataPoints
* Strip nulls and add index field to Thresholds
* Gen cue
* remove bad targets again
* Fixes
---------
Co-authored-by: Kyle Cunningham <kyle@codeincarnate.com>
Co-authored-by: sam boyer <sdboyer@grafana.com>
* bring in source from database
* bring in transformations from database
* add regex transformations to scopevar
* Consolidate types, add better example, cleanup
* Add var only if match
* Change ScopedVar to not require text, do not leak transformation-made variables between links
* Add mappings and start implementing logfmt
* Remove mappings, turn off global regex
* Add example yaml and omit transformations if empty
* Fix the yaml
* Add logfmt transformation
* Cleanup transformations and yaml
* add transformation field to FE types and use it, safeStringify logfmt values
* Add tests, only safe stringify if non-string, fix bug with safe stringify where it would return empty string with false value
* Add test for transformation field
* Do not add null transformations object
* Break out transformation logic, add tests to backend code
* Fix lint errors I understand 😅
* Fix the backend lint error
* Remove unnecessary code and mark new Transformations object as internal
* Add support for named capture groups
* Remove type assertion
* Remove variable name from transformation
* Add test for overriding regexes
* Add back variable name field, but change to mapValue
* fix go api test
* Change transformation types to enum, add better provisioning checks for bad type name and format
* Check for expression with regex transformations
* Remove Result field from AddDataSourceCommand
* Remove DatasourcesPermissionFilterQuery Result
* Remove GetDataSourceQuery Result
* Remove GetDataSourcesByTypeQuery Result
* Remove GetDataSourcesQuery Result
* Remove GetDefaultDataSourceQuery Result
* Remove UpdateDataSourceCommand Result
* add bundle registry service to avoid dependency cycles
* move user support bundle collector to user service
* move usage stat bundle implementation to usage stats
* add info for background service
* fix remaining imports
* whitespace
* Initial schema
- Add types based off of current frontend
* Rename and field-level comments
* Update report and regenerate files
* Rename frontend Azure folder
- Doing this for consistency and to ensure code-generation works
- Update betterer results due to file renames
* Remove default and add back enum vals that I deleted
* Set workspace prop as optional
* Replace template variable types
* Connect frontend query types
- Keep properties optional for now to avoid major changes
- Rename AzureMetricResource
- Correctly use ResultFormat
* Add TSVeneer decorator
* Update schema
* Update type
* Update CODEOWNERS
* Fix gen-cue issue
* Fix backend test
* Fix e2e test
* Update code coverage
* Remove references to old Azure Monitor path
* Review
* Regen files
* Add is_paused attr to the POST alert rule group endpoint
* Add is_paused to alerting API POST alert rule group
* Fixed tests
* Add is_paused to alerting gettable endpoints
* Fix integration tests
* Alerting: allow to pause existing rules (#62401)
* Display Pause Rule switch in Editing Rule form
* add isPaused property to form interface and dto
* map isPaused prop with is_paused value from DTO
Also update test snapshots
* Append '(Paused)' text on alert list state column when appropriate
* Change Switch styles according to discussion with UX
Also adding a tooltip with info what this means
* Adjust styles
* Fix alignment and isPaused type definition
Co-authored-by: gillesdemey <gilles.de.mey@gmail.com>
* Fix test
* Fix test
* Fix RuleList test
---------
Co-authored-by: gillesdemey <gilles.de.mey@gmail.com>
* wip
* Fix tests and add comments to clarify AlertRuleWithOptionals
* Fix one more test
* Fix tests
* Fix typo in comment
* Fix alert rule(s) cannot be paused via API
* Add integration tests for alerting api pausing flow
* Remove duplicated integration test
---------
Co-authored-by: Virginia Cepeda <virginia.cepeda@grafana.com>
Co-authored-by: gillesdemey <gilles.de.mey@gmail.com>
Co-authored-by: George Robinson <george.robinson@grafana.com>
* Use suggested value for uid
* update the snapshot
* use __expr__
* replace all -100 with __expr__
* update snapshot
* more changes
* revert redundant change
* Use expr.DatasourceUID where it's possible
* generate files
* schematize data query
* add the stuff you dingus
* feat(testdatasource): add scenario to generated types
* use generated testdata query in frontend
* update code owners
* Add path exception for testdata datasource
* use specific numeric data types
* fix test
* fix e2e smoketest
* add test data query type
* use test data query type
* fix betterer
* Fix typo
* move to experimental
Co-authored-by: Alex Khomenko <Clarity-89@users.noreply.github.com>
Co-authored-by: Jack Westbrook <jack.westbrook@gmail.com>
Co-authored-by: Marcus Efraimsson <marcus.efraimsson@gmail.com>
Co-authored-by: sam boyer <sdboyer@grafana.com>
* Alerting: Fix Test Receivers when settings are non-strings
As part of the Alerting extraction, we want to make sure we don't have circular dependencies. As such, I had to move `PostableGrafanaReceiver` to a new struct in `grafana/alerting` called `GrafanaReceiver`.
`PostableGrafanaReceiver` has an attribute called `Settings` that uses a Grafana-proprietary struct called `RawMessage`, which shadows `json.RawMessage`.
When I created `GrafanaReceiver`, I turned settings into a `map[string]string`, thinking all settings would end up as strings. This was a mistake; this test proves that it doesn't work and breaks the API.
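A minimal sketch of why string-only settings break (the types below are illustrative, not the actual grafana/alerting structs): contact-point settings routinely contain booleans and nested objects, which only survive when the JSON is kept raw.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical receiver with string-only settings; field names are illustrative.
type stringSettingsReceiver struct {
	Settings map[string]string `json:"settings"`
}

// Keeping the raw JSON (as PostableGrafanaReceiver does with its RawMessage-style
// field) preserves non-string values untouched.
type rawSettingsReceiver struct {
	Settings json.RawMessage `json:"settings"`
}

func main() {
	// Settings with a boolean and a nested object, as real contact points have.
	payload := []byte(`{"settings": {"uploadImage": true, "httpConfig": {"proxyURL": "http://proxy"}}}`)

	var s stringSettingsReceiver
	fmt.Println("map[string]string:", json.Unmarshal(payload, &s)) // fails: non-string values cannot be decoded

	var r rawSettingsReceiver
	if err := json.Unmarshal(payload, &r); err == nil {
		fmt.Println("raw message keeps:", string(r.Settings)) // settings preserved verbatim
	}
}
```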
* Return errors from data parsing
* Better error handling
* Fix the tests
* When there is no frame, add an empty frame so that metadata gets attached to it
* Fix tests
* Update testdata
Automatically forward core plugin request HTTP headers in outgoing HTTP requests.
Core datasource plugin authors don't have to handle forwarding of HTTP headers
themselves, e.g. they don't have to hardcode the header names in the datasource
plugin unless they have custom needs.
Fixes #57065
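As a rough illustration of what the forwarding amounts to, here is a plain net/http sketch; the function and header names are illustrative, not the actual Grafana plumbing.

```go
package main

import (
	"net/http"
)

// forwardHeaders copies selected headers from the incoming plugin request onto
// an outgoing HTTP request, so the datasource code does not have to do it itself.
// The header list here is illustrative, not Grafana's actual forwarding rules.
func forwardHeaders(incoming http.Header, outgoing *http.Request, names ...string) {
	for _, name := range names {
		if v := incoming.Get(name); v != "" {
			outgoing.Header.Set(name, v)
		}
	}
}

func main() {
	incoming := http.Header{}
	incoming.Set("Authorization", "Bearer token")
	incoming.Set("Cookie", "grafana_session=abc")

	out, _ := http.NewRequest(http.MethodGet, "https://example.org/api/query", nil)
	forwardHeaders(incoming, out, "Authorization", "Cookie")
}
```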
* Implement a backtesting engine that can process regular rule specifications (with queries to a datasource) as well as a special kind of rule that has a data frame instead of a query (see the sketch below)
* declare a new API endpoint and model
* add feature toggle `alertingBacktesting`
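A hedged sketch of the two evaluation modes, with hypothetical names throughout; this is not the actual backtesting API, only the shape of accepting either a query-backed rule or a fixed data frame.

```go
package main

import "fmt"

// evalResult stands in for the state an evaluation produces; purely illustrative.
type evalResult struct {
	Firing bool
}

// evaluator abstracts the two backtesting modes: one runs datasource queries,
// the other replays a pre-built data frame. Interface and names are hypothetical.
type evaluator interface {
	Evaluate(intervalIdx int) evalResult
}

// queryEvaluator would execute the rule's queries against the datasource for
// each backtesting interval (stubbed here).
type queryEvaluator struct{ expr string }

func (q queryEvaluator) Evaluate(int) evalResult { return evalResult{Firing: false} }

// frameEvaluator walks a provided series of values instead of querying anything.
type frameEvaluator struct{ values []float64 }

func (f frameEvaluator) Evaluate(i int) evalResult {
	return evalResult{Firing: i < len(f.values) && f.values[i] > 10}
}

// backtest evaluates the rule over n intervals and reports the firing states.
func backtest(e evaluator, n int) []evalResult {
	results := make([]evalResult, 0, n)
	for i := 0; i < n; i++ {
		results = append(results, e.Evaluate(i))
	}
	return results
}

func main() {
	fmt.Println(backtest(frameEvaluator{values: []float64{1, 20, 3}}, 3))
	fmt.Println(backtest(queryEvaluator{expr: "up == 0"}, 3))
}
```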
* Move truncation code to util to mirror upstream
* Resolve merge conflicts
* Align logging of alert key
* Update tests and fix field passing bug
* Remove superfluous newline in test now that we trim whitespace
* Uptake minor log changes from upstream
Removes request/response connection/hop-by-hop headers for call resource in a similar
manner to Go's reverse proxy functions. Also removes the Prometheus datasource's
custom call resource header manipulation with regard to hop-by-hop headers.
Fixes #60076
Ref #58646
Co-authored-by: Will Browne <wbrowne@users.noreply.github.com>
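For reference, this is the standard hop-by-hop stripping that Go's reverse proxy performs and that the change mirrors; the snippet is a standalone sketch, not the Grafana code itself.

```go
package main

import (
	"net/http"
	"strings"
)

// Hop-by-hop headers, per RFC 7230 section 6.1 and Go's httputil.ReverseProxy.
var hopHeaders = []string{
	"Connection",
	"Proxy-Connection",
	"Keep-Alive",
	"Proxy-Authenticate",
	"Proxy-Authorization",
	"Te",
	"Trailer",
	"Transfer-Encoding",
	"Upgrade",
}

// removeHopByHopHeaders strips connection-level headers before proxying,
// including any header named in the Connection header itself.
func removeHopByHopHeaders(h http.Header) {
	for _, f := range h.Values("Connection") {
		for _, name := range strings.Split(f, ",") {
			if name = strings.TrimSpace(name); name != "" {
				h.Del(name)
			}
		}
	}
	for _, name := range hopHeaders {
		h.Del(name)
	}
}

func main() {
	h := http.Header{}
	h.Set("Connection", "X-Custom-Hop")
	h.Set("X-Custom-Hop", "value")
	h.Set("Keep-Alive", "timeout=5")
	h.Set("Authorization", "Bearer token")
	removeHopByHopHeaders(h)
	// Only Authorization remains.
}
```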
This commit makes a number of changes to how images work in Slack
notifications.
It adds support for uploading images to Slack via the files.upload
API when the contact point has a token. Images are no longer linked
via a URL if a token is present.
Each image uploaded to Slack is posted as a reply to the original
notification. Up to maxImagesPerThreadTs images can be posted as
replies before a final message is sent with:
There are no images that can be shown here. To see the panels for
all firing and resolved alerts please check Grafana
Incoming Webhooks cannot upload files via files.upload and so webhooks
require the image to be uploaded to cloud storage and linked via URL.
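A simplified sketch of the files.upload call made when a token is configured; the function name and surrounding structure are illustrative, while the channels/thread_ts form fields follow Slack's documented API.

```go
package main

import (
	"bytes"
	"fmt"
	"mime/multipart"
	"net/http"
)

// uploadImageToThread posts an image as a reply to an existing Slack message
// using the files.upload Web API. This is a simplified sketch, not Grafana's
// actual Slack notifier.
func uploadImageToThread(token, channel, threadTS string, image []byte) error {
	body := &bytes.Buffer{}
	w := multipart.NewWriter(body)

	// Attach the image bytes and the channel/threading fields.
	part, err := w.CreateFormFile("file", "panel.png")
	if err != nil {
		return err
	}
	if _, err := part.Write(image); err != nil {
		return err
	}
	_ = w.WriteField("channels", channel)
	_ = w.WriteField("thread_ts", threadTS) // reply under the original notification
	if err := w.Close(); err != nil {
		return err
	}

	req, err := http.NewRequest(http.MethodPost, "https://slack.com/api/files.upload", body)
	if err != nil {
		return err
	}
	req.Header.Set("Content-Type", w.FormDataContentType())
	req.Header.Set("Authorization", "Bearer "+token) // requires a token; webhooks cannot upload

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("files.upload failed: %s", resp.Status)
	}
	return nil
}

func main() {
	_ = uploadImageToThread("xoxb-example", "C12345", "1700000000.000100", []byte{})
}
```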
Adds support for backend plugin client middlewares. This allows headers in outgoing
backend plugin and HTTP requests to be modified using client middlewares.
The following client middlewares are added:
Forward cookies: Will forward incoming HTTP request Cookies to outgoing plugins.Client
and HTTP requests if the datasource has enabled forwarding of cookies (keepCookies).
Forward OAuth token: Will set OAuth token headers on outgoing plugins.Client and HTTP
requests if the datasource has enabled Forward OAuth Identity (oauthPassThru).
Clear auth headers: Will clear any outgoing HTTP headers that were part of the incoming
HTTP request and were used when authenticating to Grafana.
The currently suggested way to register client middlewares is to have a separate package,
pluginsintegration, responsible for bootstrapping/instantiating the backend plugin client
with middlewares and, longer term, possibly plugin management.
Fixes #54135
Related to #47734
Related to #57870
Related to #41623
Related to #57065
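A sketch of the wrapping pattern the middlewares use; the interfaces below are deliberately simplified placeholders, not the actual plugins.Client or pluginsintegration APIs.

```go
package main

import (
	"context"
	"fmt"
)

// client is a stripped-down stand-in for the backend plugin client interface;
// the real plugins.Client has more methods (QueryData, CallResource, ...).
type client interface {
	QueryData(ctx context.Context, headers map[string]string) error
}

// clientFunc adapts a function to the client interface.
type clientFunc func(ctx context.Context, headers map[string]string) error

func (f clientFunc) QueryData(ctx context.Context, headers map[string]string) error {
	return f(ctx, headers)
}

// middleware wraps a client and returns a new one, mirroring how HTTP
// middlewares compose.
type middleware func(next client) client

type cookieKey struct{}

// forwardCookiesMiddleware is a hypothetical middleware that copies allowed
// cookies from the incoming request (carried on the context) onto the outgoing
// plugin request headers.
func forwardCookiesMiddleware(next client) client {
	return clientFunc(func(ctx context.Context, headers map[string]string) error {
		if cookies, ok := ctx.Value(cookieKey{}).(string); ok && cookies != "" {
			headers["Cookie"] = cookies
		}
		return next.QueryData(ctx, headers)
	})
}

// wrap applies middlewares so the first one listed runs first.
func wrap(base client, mws ...middleware) client {
	for i := len(mws) - 1; i >= 0; i-- {
		base = mws[i](base)
	}
	return base
}

func main() {
	base := clientFunc(func(_ context.Context, headers map[string]string) error {
		fmt.Println("outgoing headers:", headers)
		return nil
	})
	c := wrap(base, forwardCookiesMiddleware)

	ctx := context.WithValue(context.Background(), cookieKey{}, "grafana_session=abc")
	_ = c.QueryData(ctx, map[string]string{})
}
```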
* Introduce a new feature flag for prometheus buffered client
* Use the querydata client as default and put the buffered client behind the feature flag (see the sketch below)
* Remove prometheusStreamingJSONParser feature flag as it is not needed anymore
* Update tests
* Fix unit tests
* Update feature flag description
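A small sketch of the selection logic, assuming a toggle name like prometheusBufferedClient; the flag name and everything else here are illustrative, not the actual Prometheus datasource code.

```go
package main

import "fmt"

// featureToggles is a minimal stand-in for Grafana's feature management;
// the flag name used below is illustrative, not necessarily the real toggle name.
type featureToggles map[string]bool

func (f featureToggles) IsEnabled(flag string) bool { return f[flag] }

// newPrometheusClientKind picks the streaming querydata client by default and
// only selects the old buffered client when the (hypothetical) toggle is set.
func newPrometheusClientKind(toggles featureToggles) string {
	if toggles.IsEnabled("prometheusBufferedClient") { // illustrative flag name
		return "buffered"
	}
	return "querydata"
}

func main() {
	fmt.Println(newPrometheusClientKind(featureToggles{}))                                 // querydata (default)
	fmt.Println(newPrometheusClientKind(featureToggles{"prometheusBufferedClient": true})) // buffered
}
```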
* Remove URL-based alertmanagers from endpoint config
* WIP
* Add migration and alertmanagers from admin_configuration
* Empty comment removed
* Set BasicAuth to true when a user is present in the URL
* Remove Alertmanagers from GET /admin_config payload
* Remove URL-based alertmanager configuration from UI
* Fix new uid generation in external alertmanagers migration
* Fix tests for URL-based external alertmanagers
* Fix API tests
* Add more tests, move migration code to a separate file, and remove possible duplicate Alertmanager URLs
* Fix edge cases in migration
* Fix imports
* Remove useless fields and fix created_at/updated_at retrieval
Co-authored-by: George Robinson <george.robinson@grafana.com>
Co-authored-by: Konrad Lalik <konrad.lalik@grafana.com>