CI: Lint Starlark files with buildifier (#59157)

* Add verify-starlark build action that returns an error for Starlark files with lint

Relies on the `buildifier` tool.

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Add verify_starlark_step to PR pipeline

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>
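For orientation, a minimal sketch of what such a Drone step can look like in Starlark. The step name, image reference, and commands below are illustrative assumptions, not the exact definition added by this commit:

```
def lint_starlark_step():
    """Runs the verify-starlark action over the repository's Starlark files."""
    return {
        "name": "lint-starlark",
        # Assumed image reference; the real step runs in the Grafana build container,
        # which now ships the buildifier binary.
        "image": "grafana/build-container:<version>",
        "depends_on": [
            "compile-build-cmd",
        ],
        "commands": [
            "./bin/build verify-starlark .",
        ],
    }
```

In the PR pipeline the step only needs to fire when Drone configuration changes, which is why its trigger is limited to the `scripts/drone/**` and `.drone.star` paths.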

* Manually fetch buildifier in curl_image until a new build_image is created

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Format with buildifier

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>
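This reformat is why nearly every Starlark line in the diff below changes: buildifier standardises on double quotes, alphabetically sorted load symbols, spaces around `=` in keyword arguments, and trailing operators in wrapped expressions. An illustrative fragment (the module path and function names are invented):

```
# Before, in the style buildifier rejects:
#
#   load('scripts/drone/example.star', 'other_pipelines', 'example_pipelines')
#
#   def pipelines():
#       return (
#           example_pipelines(prefix='custom-')
#           + other_pipelines()
#       )

# After formatting with buildifier:
load("scripts/drone/example.star", "example_pipelines", "other_pipelines")

def pipelines():
    return (
        example_pipelines(prefix = "custom-") +
        other_pipelines()
    )
```

Module and function docstrings are a separate lint concern, handled by later commits in this series.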

* Remove all unused variables, retaining one unused function

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Use snake_case for variable

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Replace deprecated dictionary concatenation with .update() method

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>
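buildifier flags Starlark dictionary concatenation with `+` as deprecated; the replacement is an explicit copy followed by `.update()`. A short sketch with invented names:

```
def merged_environment(base, extra):
    """Merges extra settings into a copy of base without mutating either argument."""

    # Deprecated form that buildifier rejects:
    #   return base + extra
    env = {}
    env.update(base)
    env.update(extra)
    return env
```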

* Start adding docstrings for all modules and functions

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>
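The module-docstring and function-docstring warnings expect a docstring as the first statement of every file and a Google-style layout for functions, with Args and Returns sections; a later commit in this series additionally adds a blank line between those two sections. An illustrative sketch (the step and its fields are invented):

```
"""
This module provides an example step, used only to illustrate docstring layout.
"""

def example_step(name, commands):
    """Returns a Drone step that runs the given commands.

    Args:
      name: name of the step as it appears in the Drone UI.
      commands: list of shell commands the step executes.

    Returns:
      Drone step dict.
    """
    return {
        "name": name,
        "image": "alpine:3",
        "commands": commands,
    }
```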

* Prefer os.WriteFile, as ioutil.WriteFile has been deprecated since Go 1.16

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Attempt to document the behavior of the init_enterprise_step

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document test_backend pipeline

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document enterprise_downstream_step

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document the pipeline utility function

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document publish_images_step

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document publish_images_steps

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document enterprise2_pipelines function

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Add tags table for Starlark files.

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document test_frontend

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document windows function

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Add docstrings to verifystarlark functions

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Refactor error handling to be more clear and document complex behavior

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Split errors into execution errors and verification errors

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document all other library functions

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Add local variables to TAGS

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Add blank line between all Args and Returns sections

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Fix new linting errors

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Lint new Starlark files

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Correct buildifier binary mv

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Document the need to set nofile ulimit to at least 2048

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Update build-container to include buildifier

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Ensure buildifier binary is executable

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Fix valid content test

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Simply return execution error

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Only check files rather than fixing them

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Use updated build-container with executable buildifier

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Test that context cancellation stops execution

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Simplify error handling

Return execution errors that short-circuit WalkDir rather than
separately tracking that error.

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Remove fetching of buildifier binary now that it is in the build-container

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Use build image in verify-starlark step

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Use semver tag

The image is the same but uses a semver tag to make it clearer that
this is a forward upgrade from the old version.

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

* Use node 18 image with buildifier

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>

---------

Signed-off-by: Jack Baldry <jack.baldry@grafana.com>
Jack Baldry 2023-01-30 09:27:11 +00:00 committed by GitHub
parent 907e2a840e
commit 8379a5338c
33 changed files with 3613 additions and 2223 deletions

View File

@ -3,54 +3,55 @@
# 2. Login to drone and export the env variables (token and server) shown here: https://drone.grafana.net/account
# 3. Run `make drone`
# More information about this process here: https://github.com/grafana/deployment_tools/blob/master/docs/infrastructure/drone/signing.md
"""
This module returns a Drone configuration including pipelines and secrets.
"""
load('scripts/drone/events/pr.star', 'pr_pipelines')
load('scripts/drone/events/main.star', 'main_pipelines')
load('scripts/drone/pipelines/docs.star', 'docs_pipelines')
load("scripts/drone/events/pr.star", "pr_pipelines")
load("scripts/drone/events/main.star", "main_pipelines")
load(
'scripts/drone/events/release.star',
'oss_pipelines',
'enterprise_pipelines',
'enterprise2_pipelines',
'publish_artifacts_pipelines',
'publish_npm_pipelines',
'publish_packages_pipeline',
'artifacts_page_pipeline',
"scripts/drone/events/release.star",
"artifacts_page_pipeline",
"enterprise2_pipelines",
"enterprise_pipelines",
"oss_pipelines",
"publish_artifacts_pipelines",
"publish_npm_pipelines",
"publish_packages_pipeline",
)
load(
'scripts/drone/pipelines/publish_images.star',
'publish_image_pipelines_public',
'publish_image_pipelines_security',
"scripts/drone/pipelines/publish_images.star",
"publish_image_pipelines_public",
"publish_image_pipelines_security",
)
load('scripts/drone/pipelines/github.star', 'publish_github_pipeline')
load('scripts/drone/pipelines/aws_marketplace.star', 'publish_aws_marketplace_pipeline')
load('scripts/drone/version.star', 'version_branch_pipelines')
load('scripts/drone/events/cron.star', 'cronjobs')
load('scripts/drone/vault.star', 'secrets')
load("scripts/drone/pipelines/github.star", "publish_github_pipeline")
load("scripts/drone/pipelines/aws_marketplace.star", "publish_aws_marketplace_pipeline")
load("scripts/drone/version.star", "version_branch_pipelines")
load("scripts/drone/events/cron.star", "cronjobs")
load("scripts/drone/vault.star", "secrets")
def main(ctx):
def main(_ctx):
return (
pr_pipelines()
+ main_pipelines()
+ oss_pipelines()
+ enterprise_pipelines()
+ enterprise2_pipelines()
+ enterprise2_pipelines(
prefix='custom-',
trigger={'event': ['custom']},
)
+ publish_image_pipelines_public()
+ publish_image_pipelines_security()
+ publish_github_pipeline('public')
+ publish_github_pipeline('security')
+ publish_aws_marketplace_pipeline('public')
+ publish_artifacts_pipelines('security')
+ publish_artifacts_pipelines('public')
+ publish_npm_pipelines()
+ publish_packages_pipeline()
+ artifacts_page_pipeline()
+ version_branch_pipelines()
+ cronjobs()
+ secrets()
pr_pipelines() +
main_pipelines() +
oss_pipelines() +
enterprise_pipelines() +
enterprise2_pipelines() +
enterprise2_pipelines(
prefix = "custom-",
trigger = {"event": ["custom"]},
) +
publish_image_pipelines_public() +
publish_image_pipelines_security() +
publish_github_pipeline("public") +
publish_github_pipeline("security") +
publish_aws_marketplace_pipeline("public") +
publish_artifacts_pipelines("security") +
publish_artifacts_pipelines("public") +
publish_npm_pipelines() +
publish_packages_pipeline() +
artifacts_page_pipeline() +
version_branch_pipelines() +
cronjobs() +
secrets()
)

File diff suppressed because it is too large

View File

@ -239,8 +239,12 @@ drone: $(DRONE)
$(DRONE) lint .drone.yml --trusted
$(DRONE) --server https://drone.grafana.net sign --save grafana/grafana
# Generate an Emacs tags table (https://www.gnu.org/software/emacs/manual/html_node/emacs/Tags-Tables.html) for Starlark files.
scripts/drone/TAGS: $(shell find scripts/drone -name '*.star')
etags --lang none --regex="/def \(\w+\)[^:]+:/\1/" --regex="/\s*\(\w+\) =/\1/" $^ -o $@
format-drone:
black --include '\.star$$' -S scripts/drone/ .drone.star
buildifier -r scripts/drone
help: ## Display this help.
@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m<target>\033[0m\n"} /^[a-zA-Z_-]+:.*?##/ { printf " \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)

View File

@ -132,6 +132,12 @@ func main() {
Usage: "Verify Drone configuration",
Action: VerifyDrone,
},
{
Name: "verify-starlark",
Usage: "Verify Starlark configuration",
ArgsUsage: "<workspace path>",
Action: VerifyStarlark,
},
{
Name: "export-version",
Usage: "Exports version in dist/grafana.version",

View File

@ -0,0 +1,142 @@
package main
import (
"context"
"errors"
"fmt"
"io/fs"
"os/exec"
"path/filepath"
"strings"
"github.com/urfave/cli/v2"
)
func mapSlice[I any, O any](a []I, f func(I) O) []O {
o := make([]O, len(a))
for i, e := range a {
o[i] = f(e)
}
return o
}
// VerifyStarlark is the CLI Action for verifying Starlark files in a workspace.
// It expects a single context argument which is the path to the workspace.
// The actual verification procedure can return multiple errors which are
// joined together to be one holistic error for the action.
func VerifyStarlark(c *cli.Context) error {
if c.NArg() != 1 {
var message string
if c.NArg() == 0 {
message = "ERROR: missing required argument <workspace path>"
}
if c.NArg() > 1 {
message = "ERROR: too many arguments"
}
if err := cli.ShowSubcommandHelp(c); err != nil {
return err
}
return cli.Exit(message, 1)
}
workspace := c.Args().Get(0)
verificationErrs, executionErr := verifyStarlark(c.Context, workspace, buildifierLintCommand)
if executionErr != nil {
return executionErr
}
if len(verificationErrs) == 0 {
return nil
}
noun := "file"
if len(verificationErrs) > 1 {
noun += "s"
}
return fmt.Errorf("verification failed for %d %s:\n%s",
len(verificationErrs),
noun,
strings.Join(
mapSlice(verificationErrs, func(e error) string { return e.Error() }),
"\n",
))
}
type commandFunc = func(path string) (command string, args []string)
func buildifierLintCommand(path string) (string, []string) {
return "buildifier", []string{"-lint", "warn", "-mode", "check", path}
}
// verifyStarlark walks all directories starting at provided workspace path and
// verifies any Starlark files it finds.
// Starlark files are assumed to end with the .star extension.
// The verification relies on linting provided by the 'buildifier' binary which
// must be in the PATH.
// A slice of verification errors is returned, one for each file that failed verification.
// If any execution of the `buildifier` command fails, this is returned separately.
// commandFn is executed on every Starlark file to determine the command and arguments to be executed.
// The caller is trusted and it is the caller's responsibility to ensure that the resulting command is safe to execute.
func verifyStarlark(ctx context.Context, workspace string, commandFn commandFunc) ([]error, error) {
var verificationErrs []error
// All errors from filepath.WalkDir are filtered by the fs.WalkDirFunc.
// Lstat or ReadDir errors are reported as verificationErrors.
// If any execution of the `buildifier` command fails or if the context is cancelled,
// it is reported as an error and any verification of subsequent files is skipped.
err := filepath.WalkDir(workspace, func(path string, d fs.DirEntry, err error) error {
// Skip verification of the file or files within the directory if there is an error
// returned by Lstat or ReadDir.
if err != nil {
verificationErrs = append(verificationErrs, err)
return nil
}
if d.IsDir() {
return nil
}
if filepath.Ext(path) == ".star" {
command, args := commandFn(path)
// The caller is trusted.
//nolint:gosec
cmd := exec.CommandContext(ctx, command, args...)
cmd.Dir = workspace
_, err = cmd.Output()
if err == nil { // No error, early return.
return nil
}
// The error returned from cmd.Output() is never wrapped.
//nolint:errorlint
if err, ok := err.(*exec.ExitError); ok {
switch err.ExitCode() {
// Case comments are informed by the output of `buildifier --help`
case 1: // syntax errors in input
verificationErrs = append(verificationErrs, errors.New(string(err.Stderr)))
return nil
case 2: // usage errors: invoked incorrectly
return fmt.Errorf("command %q: %s", cmd, err.Stderr)
case 3: // unexpected runtime errors: file I/O problems or internal bugs
return fmt.Errorf("command %q: %s", cmd, err.Stderr)
case 4: // check mode failed (reformat is needed)
verificationErrs = append(verificationErrs, errors.New(string(err.Stderr)))
return nil
default:
return fmt.Errorf("command %q: %s", cmd, err.Stderr)
}
}
// Error was not an exit error from the command.
return fmt.Errorf("command %q: %v", cmd, err)
}
return nil
})
return verificationErrs, err
}

View File

@ -0,0 +1,135 @@
package main
import (
"context"
"os"
"path/filepath"
"strings"
"testing"
"github.com/stretchr/testify/require"
)
func TestVerifyStarlark(t *testing.T) {
t.Run("execution errors", func(t *testing.T) {
t.Run("invalid usage", func(t *testing.T) {
ctx := context.Background()
workspace := t.TempDir()
err := os.WriteFile(filepath.Join(workspace, "ignored.star"), []byte{}, os.ModePerm)
if err != nil {
t.Fatalf(err.Error())
}
_, executionErr := verifyStarlark(ctx, workspace, func(string) (string, []string) { return "buildifier", []string{"--invalid"} })
if executionErr == nil {
t.Fatalf("Expected execution error but got none")
}
})
t.Run("context cancellation", func(t *testing.T) {
ctx, cancel := context.WithCancel(context.Background())
workspace := t.TempDir()
err := os.WriteFile(filepath.Join(workspace, "ignored.star"), []byte{}, os.ModePerm)
if err != nil {
t.Fatalf(err.Error())
}
err = os.WriteFile(filepath.Join(workspace, "other-ignored.star"), []byte{}, os.ModePerm)
if err != nil {
t.Fatalf(err.Error())
}
cancel()
_, executionErr := verifyStarlark(ctx, workspace, buildifierLintCommand)
if executionErr == nil {
t.Fatalf("Expected execution error but got none")
}
})
})
t.Run("verification errors", func(t *testing.T) {
t.Run("a single file with lint", func(t *testing.T) {
ctx := context.Background()
workspace := t.TempDir()
invalidContent := []byte(`load("scripts/drone/other.star", "function")
function()`)
err := os.WriteFile(filepath.Join(workspace, "has-lint.star"), invalidContent, os.ModePerm)
if err != nil {
t.Fatalf(err.Error())
}
verificationErrs, executionErr := verifyStarlark(ctx, workspace, buildifierLintCommand)
if executionErr != nil {
t.Fatalf("Unexpected execution error: %v", executionErr)
}
if len(verificationErrs) == 0 {
t.Fatalf(`"has-lint.star" requires linting but the verifyStarlark function provided no linting error`)
}
if len(verificationErrs) > 1 {
t.Fatalf(`verifyStarlark returned multiple errors for the "has-lint.star" file but only one was expected: %v`, verificationErrs)
}
if !strings.Contains(verificationErrs[0].Error(), "has-lint.star:1: module-docstring: The file has no module docstring.") {
t.Fatalf(`"has-lint.star" is missing a module docstring but the verifyStarlark function linting error did not mention this, instead we got: %v`, verificationErrs[0])
}
})
t.Run("no files with lint", func(t *testing.T) {
ctx := context.Background()
workspace := t.TempDir()
content := []byte(`"""
This module does nothing.
"""
load("scripts/drone/other.star", "function")
function()
`)
require.NoError(t, os.WriteFile(filepath.Join(workspace, "no-lint.star"), content, os.ModePerm))
verificationErrs, executionErr := verifyStarlark(ctx, workspace, buildifierLintCommand)
if executionErr != nil {
t.Fatalf("Unexpected execution error: %v", executionErr)
}
if len(verificationErrs) != 0 {
t.Log(`"no-lint.star" has no lint but the verifyStarlark function provided at least one error`)
for _, err := range verificationErrs {
t.Log(err)
}
t.FailNow()
}
})
t.Run("multiple files with lint", func(t *testing.T) {
ctx := context.Background()
workspace := t.TempDir()
invalidContent := []byte(`load("scripts/drone/other.star", "function")
function()`)
require.NoError(t, os.WriteFile(filepath.Join(workspace, "has-lint.star"), invalidContent, os.ModePerm))
require.NoError(t, os.WriteFile(filepath.Join(workspace, "has-lint2.star"), invalidContent, os.ModePerm))
verificationErrs, executionErr := verifyStarlark(ctx, workspace, buildifierLintCommand)
if executionErr != nil {
t.Fatalf("Unexpected execution error: %v", executionErr)
}
if len(verificationErrs) == 0 {
t.Fatalf(`Two files require linting but the verifyStarlark function provided no linting error`)
}
if len(verificationErrs) == 1 {
t.Fatalf(`Two files require linting but the verifyStarlark function provided only one linting error: %v`, verificationErrs[0])
}
if len(verificationErrs) > 2 {
t.Fatalf(`verifyStarlark returned more errors than expected: %v`, verificationErrs)
}
if !strings.Contains(verificationErrs[0].Error(), "has-lint.star:1: module-docstring: The file has no module docstring.") {
t.Errorf(`"has-lint.star" is missing a module docstring but the verifyStarlark function linting error did not mention this, instead we got: %v`, verificationErrs[0])
}
if !strings.Contains(verificationErrs[1].Error(), "has-lint2.star:1: module-docstring: The file has no module docstring.") {
t.Fatalf(`"has-lint2.star" is missing a module docstring but the verifyStarlark function linting error did not mention this, instead we got: %v`, verificationErrs[0])
}
})
})
}

View File

@ -87,6 +87,12 @@ RUN curl -fLO http://storage.googleapis.com/grafana-downloads/ci-dependencies/sh
RUN echo $SHELLCHECK_CHKSUM shellcheck-v${SHELLCHECK_VERSION}.linux.x86_64.tar.xz | sha512sum --check --strict --status
RUN tar xf shellcheck-v${SHELLCHECK_VERSION}.linux.x86_64.tar.xz && mv shellcheck-v${SHELLCHECK_VERSION}/shellcheck /tmp/
ARG BUILDIFIER_VERSION=5.1.0
ARG BUILDIFIER_CHKSUM=52bf6b102cb4f88464e197caac06d69793fa2b05f5ad50a7e7bf6fbd656648a3
RUN curl -fLO https://github.com/bazelbuild/buildtools/releases/download/${BUILDIFIER_VERSION}/buildifier-linux-amd64
RUN echo $BUILDIFIER_CHKSUM buildifier-linux-amd64 | sha256sum --check --strict --status
RUN mv buildifier-linux-amd64 /tmp/buildifier && chmod +x /tmp/buildifier
ARG CUE_VERSION=0.3.0-alpha5
ARG CUE_CHKSUM=9d3131e470cdb5182afd9966688f1c052d383145cce005a947156b5591da39b7
RUN curl -fLO https://github.com/cuelang/cue/releases/download/v${CUE_VERSION}/cue_${CUE_VERSION}_Linux_x86_64.tar.gz
@ -113,6 +119,7 @@ ARG DEBIAN_FRONTEND=noninteractive
COPY --from=toolchain /tmp/x86_64-centos6-linux-gnu.tar.xz /tmp/osxcross.tar.xz /tmp/
COPY --from=toolchain /tmp/shellcheck /usr/local/bin/
COPY --from=toolchain /tmp/buildifier /usr/local/bin/
COPY --from=toolchain /tmp/cue /usr/local/bin/
COPY --from=toolchain /tmp/dockerize /usr/local/bin/

View File

@ -11,7 +11,7 @@ In order to build and publish the Grafana build Docker image, execute the follow
```
# Download MacOSX10.15.sdk.tar.xz from our private GCS bucket into this directory
docker build -t grafana/build-container:<VERSION> .
docker build -t grafana/build-container:<VERSION> --ulimit nofile=2048:2048 .
docker push grafana/build-container:<VERSION>
```

628
scripts/drone/TAGS Normal file
View File

@ -0,0 +1,628 @@
events/release.star,6652
ver_mode =ver_mode64,1602
release_trigger =release_trigger65,1623
def store_npm_packages_step():store_npm_packages_step74,1752
def retrieve_npm_packages_step():retrieve_npm_packages_step90,2193
def release_npm_packages_step():release_npm_packages_step107,2663
def oss_pipelines(ver_mode = ver_mode, trigger = release_trigger):oss_pipelines123,3076
environment =environment135,3492
edition =edition136,3529
services =services137,3549
volumes =volumes138,3609
package_steps =package_steps139,3659
publish_steps =publish_steps140,3682
should_publish =should_publish141,3705
should_upload =should_upload142,3748
init_steps =init_steps143,3818
build_steps =build_steps152,4033
integration_test_steps =integration_test_steps159,4342
build_storybook =build_storybook182,5254
publish_step =publish_step190,5674
store_npm_step =store_npm_step191,5758
windows_package_steps =windows_package_steps196,5957
windows_pipeline =windows_pipeline198,6044
name =name199,6077
edition =edition200,6127
trigger =trigger201,6154
steps =steps202,6181
platform =platform203,6256
depends_on =depends_on204,6286
environment =environment207,6393
pipelines =pipelines209,6434
name =name211,6470
edition =edition212,6550
trigger =trigger213,6581
services =services214,6612
steps =steps215,6639
environment =environment216,6717
volumes =volumes217,6756
name =name225,6970
edition =edition226,7038
trigger =trigger227,7073
services =services228,7108
steps =steps229,7145
environment =environment230,7329
volumes =volumes231,7372
deps =deps234,7433
def enterprise_pipelines(ver_mode = ver_mode, trigger = release_trigger):enterprise_pipelines247,7856
environment =environment259,8284
edition =edition260,8328
services =services261,8355
volumes =volumes262,8415
package_steps =package_steps263,8465
publish_steps =publish_steps264,8488
should_publish =should_publish265,8511
should_upload =should_upload266,8554
include_enterprise =include_enterprise267,8624
edition2 =edition2268,8673
init_steps =init_steps269,8702
build_steps =build_steps277,8909
integration_test_steps =integration_test_steps284,9218
build_storybook =build_storybook312,10299
publish_step =publish_step324,10892
store_npm_step =store_npm_step325,10976
windows_package_steps =windows_package_steps330,11175
step =step333,11284
deps_on_clone_enterprise_step =deps_on_clone_enterprise_step337,11418
windows_pipeline =windows_pipeline347,11746
name =name348,11779
edition =edition349,11836
trigger =trigger350,11863
steps =steps351,11890
platform =platform352,11965
depends_on =depends_on353,11995
environment =environment356,12109
pipelines =pipelines358,12150
name =name360,12186
edition =edition361,12273
trigger =trigger362,12304
services =services363,12335
steps =steps364,12362
environment =environment365,12440
volumes =volumes366,12479
name =name374,12711
edition =edition375,12786
trigger =trigger376,12821
services =services377,12856
steps =steps378,12893
environment =environment379,13213
volumes =volumes380,13256
deps =deps383,13317
def enterprise2_pipelines(prefix = "", ver_mode = ver_mode, trigger = release_trigger):enterprise2_pipelines397,13769
environment =environment412,14364
edition =edition415,14424
volumes =volumes416,14451
package_steps =package_steps417,14501
publish_steps =publish_steps418,14524
should_publish =should_publish419,14547
should_upload =should_upload420,14590
include_enterprise =include_enterprise421,14660
edition2 =edition2422,14709
init_steps =init_steps423,14738
build_steps =build_steps431,14945
fetch_images =fetch_images442,15355
upload_cdn =upload_cdn444,15497
step =step458,16187
deps_on_clone_enterprise_step =deps_on_clone_enterprise_step462,16321
pipelines =pipelines472,16608
name =name474,16644
edition =edition475,16742
trigger =trigger476,16773
services =services477,16804
steps =steps478,16831
volumes =volumes479,16909
environment =environment480,16940
def publish_artifacts_step(mode):publish_artifacts_step486,17019
security =security487,17053
security =security489,17098
def publish_artifacts_pipelines(mode):publish_artifacts_pipelines501,17538
trigger =trigger502,17577
steps =steps506,17655
name =name512,17768
trigger =trigger513,17820
steps =steps514,17847
edition =edition515,17870
environment =environment516,17895
def publish_packages_pipeline():publish_packages_pipeline519,17945
trigger =trigger526,18162
oss_steps =oss_steps530,18244
enterprise_steps =enterprise_steps538,18560
deps =deps545,18903
name =name552,19062
trigger =trigger553,19101
steps =steps554,19128
edition =edition555,19155
depends_on =depends_on556,19180
environment =environment557,19207
name =name559,19266
trigger =trigger560,19312
steps =steps561,19339
edition =edition562,19373
depends_on =depends_on563,19398
environment =environment564,19425
def publish_npm_pipelines(mode):publish_npm_pipelines567,19482
trigger =trigger568,19515
steps =steps572,19593
name =name580,19772
trigger =trigger581,19827
steps =steps582,19854
edition =edition583,19877
environment =environment584,19902
def artifacts_page_pipeline():artifacts_page_pipeline587,19952
trigger =trigger588,19983
name =name593,20087
trigger =trigger594,20128
steps =steps595,20155
edition =edition596,20220
environment =environment597,20245
def get_e2e_suffix():get_e2e_suffix600,20295
events/cron.star,1016
aquasec_trivy_image =aquasec_trivy_image8,209
def cronjobs(edition):cronjobs10,255
grafana_com_nightly_pipeline =grafana_com_nightly_pipeline11,278
cronName =cronName12,332
name =name13,374
steps =steps14,412
def cron_job_pipeline(cronName, name, steps):cron_job_pipeline24,773
def scan_docker_image_pipeline(edition, tag):scan_docker_image_pipeline43,1175
edition =edition55,1530
edition =edition57,1579
docker_image =docker_image59,1608
cronName =cronName62,1695
name =name63,1725
steps =steps64,1775
def scan_docker_image_unkown_low_medium_vulnerabilities_step(docker_image):scan_docker_image_unkown_low_medium_vulnerabilities_step71,2047
def scan_docker_image_high_critical_vulnerabilities_step(docker_image):scan_docker_image_high_critical_vulnerabilities_step80,2353
def slack_job_failed_step(channel, image):slack_job_failed_step89,2646
def post_to_grafana_com_step():post_to_grafana_com_step103,3069
events/main.star,633
ver_mode =ver_mode49,966
trigger =trigger50,984
def main_pipelines(edition):main_pipelines62,1168
drone_change_trigger =drone_change_trigger63,1197
pipelines =pipelines79,1513
name =name89,1951
slack_channel =slack_channel90,1994
trigger =trigger91,2045
template =template92,2089
secret =secret93,2135
name =name97,2276
slack_channel =slack_channel98,2310
trigger =trigger99,2366
depends_on =depends_on100,2425
template =template101,2563
secret =secret102,2604
events/pr.star,252
ver_mode =ver_mode48,997
trigger =trigger49,1013
def pr_pipelines(edition):pr_pipelines62,1198
def get_pr_trigger(include_paths = None, exclude_paths = None):get_pr_trigger76,2396
paths_ex =paths_ex91,3080
paths_in =paths_in92,3115
services/services.star,225
def integration_test_services_volumes():integration_test_services_volumes5,79
def integration_test_services(edition):integration_test_services14,292
services =services15,332
def ldap_service():ldap_service59,1616
utils/utils.star,561
failure_template =failure_template11,191
drone_change_template =drone_change_template12,509
services =services19,932
platform =platform20,955
depends_on =depends_on21,983
environment =environment22,1008
volumes =volumes23,1036
platform_conf =platform_conf50,2166
platform_conf =platform_conf62,2534
pipeline =pipeline70,2713
def notify_pipeline(name, slack_channel, trigger, depends_on = [], template = None, secret = None):notify_pipeline105,3545
trigger =trigger106,3645
pipelines/trigger_downstream.star,440
trigger =trigger14,249
def enterprise_downstream_pipeline(edition, ver_mode):enterprise_downstream_pipeline26,433
environment =environment27,488
steps =steps28,527
deps =deps29,587
name =name31,672
edition =edition32,714
trigger =trigger33,741
services =services34,768
steps =steps35,791
depends_on =depends_on36,814
environment =environment37,841
pipelines/verify_starlark.star,323
def verify_starlark(trigger, ver_mode):verify_starlark17,305
environment =environment18,345
steps =steps19,382
name =name26,546
edition =edition27,600
trigger =trigger28,625
services =services29,652
steps =steps30,675
environment =environment31,698
pipelines/build.star,508
def build_e2e(trigger, ver_mode, edition):build_e2e39,936
environment =environment50,1096
variants =variants51,1135
init_steps =init_steps52,1219
build_steps =build_steps61,1491
publish_suffix =publish_suffix107,4049
publish_suffix =publish_suffix109,4100
name =name112,4158
edition =edition113,4224
environment =environment114,4249
services =services115,4284
steps =steps116,4307
trigger =trigger117,4349
pipelines/shellcheck.star,386
trigger =trigger15,235
def shellcheck_step():shellcheck_step31,483
def shellcheck_pipeline():shellcheck_pipeline43,725
environment =environment44,752
steps =steps45,789
name =name50,886
edition =edition51,918
trigger =trigger52,943
services =services53,970
steps =steps54,993
environment =environment55,1016
pipelines/verify_drone.star,317
def verify_drone(trigger, ver_mode):verify_drone17,293
environment =environment18,330
steps =steps19,367
name =name26,528
edition =edition27,579
trigger =trigger28,604
services =services29,631
steps =steps30,654
environment =environment31,677
pipelines/test_backend.star,474
def test_backend(trigger, ver_mode, edition = "oss"):test_backend23,463
environment =environment35,882
init_steps =init_steps36,921
test_steps =test_steps46,1291
pipeline_name =pipeline_name51,1387
pipeline_name =pipeline_name53,1492
name =name55,1584
edition =edition56,1614
trigger =trigger57,1641
services =services58,1668
steps =steps59,1691
environment =environment60,1732
pipelines/lint_frontend.star,415
def lint_frontend_pipeline(trigger, ver_mode):lint_frontend_pipeline16,260
environment =environment26,546
yarn_step =yarn_step27,583
init_steps =init_steps29,660
test_steps =test_steps33,736
name =name37,812
edition =edition38,864
trigger =trigger39,889
services =services40,916
steps =steps41,939
environment =environment42,980
pipelines/docs.star,494
docs_paths =docs_paths19,383
def docs_pipelines(edition, ver_mode, trigger):docs_pipelines28,511
environment =environment29,559
steps =steps30,598
name =name40,815
edition =edition41,858
trigger =trigger42,885
services =services43,912
steps =steps44,935
environment =environment45,958
def lint_docs():lint_docs48,1000
def trigger_docs_main():trigger_docs_main63,1328
def trigger_docs_pr():trigger_docs_pr72,1478
pipelines/test_frontend.star,476
def test_frontend(trigger, ver_mode, edition = "oss"):test_frontend20,374
environment =environment32,794
init_steps =init_steps33,833
test_steps =test_steps41,1102
pipeline_name =pipeline_name45,1205
pipeline_name =pipeline_name47,1311
name =name49,1404
edition =edition50,1434
trigger =trigger51,1461
services =services52,1488
steps =steps53,1511
environment =environment54,1552
pipelines/integration_tests.star,483
def integration_tests(trigger, ver_mode, edition):integration_tests26,542
environment =environment37,900
services =services38,939
volumes =volumes39,989
init_steps =init_steps40,1039
test_steps =test_steps48,1282
name =name54,1412
edition =edition55,1468
trigger =trigger56,1493
services =services57,1520
steps =steps58,1549
environment =environment59,1590
volumes =volumes60,1625
pipelines/windows.star,954
def windows(trigger, edition, ver_mode):windows17,339
environment =environment29,798
init_cmds =init_cmds30,837
steps =steps38,1205
bucket =bucket49,1497
ver_part =ver_part51,1590
dir =dir52,1628
dir =dir54,1670
bucket =bucket55,1695
build_no =build_no56,1736
ver_part =ver_part57,1780
installer_commands =installer_commands58,1842
committish =committish100,3763
committish =committish102,3846
committish =committish104,3906
download_grabpl_step_cmds =download_grabpl_step_cmds107,4057
clone_cmds =clone_cmds113,4363
name =name146,5711
edition =edition147,5742
trigger =trigger148,5769
steps =steps149,5830
depends_on =depends_on150,5889
platform =platform151,6007
environment =environment152,6037
pipelines/lint_backend.star,418
def lint_backend_pipeline(trigger, ver_mode):lint_backend_pipeline18,306
environment =environment28,590
wire_step =wire_step29,627
init_steps =init_steps31,704
test_steps =test_steps36,809
name =name43,959
edition =edition44,1010
trigger =trigger45,1035
services =services46,1062
steps =steps47,1085
environment =environment48,1126
pipelines/publish_images.star,998
def publish_image_steps(edition, mode, docker_repo):publish_image_steps17,303
additional_docker_repo =additional_docker_repo31,922
additional_docker_repo =additional_docker_repo33,979
steps =steps34,1034
def publish_image_pipelines_public():publish_image_pipelines_public45,1369
mode =mode51,1521
trigger =trigger52,1541
name =name57,1641
trigger =trigger58,1694
steps =steps59,1721
edition =edition60,1813
environment =environment61,1835
name =name63,1894
trigger =trigger64,1954
steps =steps65,1981
edition =edition66,2091
environment =environment67,2113
def publish_image_pipelines_security():publish_image_pipelines_security70,2170
mode =mode71,2210
trigger =trigger72,2232
name =name77,2332
trigger =trigger78,2392
steps =steps79,2419
edition =edition80,2529
environment =environment81,2551
steps/lib.star,8579
grabpl_version =grabpl_version7,181
build_image =build_image8,208
publish_image =publish_image9,254
deploy_docker_image =deploy_docker_image10,304
alpine_image =alpine_image11,380
curl_image =curl_image12,411
windows_image =windows_image13,452
wix_image =wix_image14,501
go_image =go_image15,536
disable_tests =disable_tests17,564
trigger_oss =trigger_oss18,586
def slack_step(channel, template, secret):slack_step24,653
def yarn_install_step(edition = "oss"):yarn_install_step35,918
deps =deps36,958
deps =deps38,1004
def wire_install_step():wire_install_step48,1222
def identify_runner_step(platform = "linux"):identify_runner_step60,1454
def clone_enterprise_step(ver_mode):clone_enterprise_step78,1916
committish =committish87,2193
committish =committish89,2268
committish =committish91,2317
def init_enterprise_step(ver_mode):init_enterprise_step105,2747
source_commit =source_commit115,3098
source_commit =source_commit117,3151
environment =environment118,3191
token =token121,3280
environment =environment123,3369
token =token126,3458
environment =environment128,3518
token =token129,3543
def download_grabpl_step(platform = "linux"):download_grabpl_step148,4147
def lint_drone_step():lint_drone_step173,4973
def lint_starlark_step():lint_starlark_step185,5216
def enterprise_downstream_step(edition, ver_mode):enterprise_downstream_step206,6000
repo =repo219,6482
step =step225,6623
def lint_backend_step():lint_backend_step247,7248
def benchmark_ldap_step():benchmark_ldap_step265,7713
def build_storybook_step(edition, ver_mode):build_storybook_step278,8087
def store_storybook_step(edition, ver_mode, trigger = None):store_storybook_step300,8743
commands =commands314,9202
commands =commands323,9521
step =step325,9593
when_cond =when_cond338,10125
step =step346,10330
def e2e_tests_artifacts(edition):e2e_tests_artifacts349,10391
def upload_cdn_step(edition, ver_mode, trigger = None):upload_cdn_step386,12378
deps =deps397,12763
step =step407,12970
step =step420,13423
def build_backend_step(edition, ver_mode, variants = None):build_backend_step423,13482
variants_str =variants_str437,14070
variants_str =variants_str439,14109
cmds =cmds443,14256
build_no =build_no449,14418
cmds =cmds450,14461
def build_frontend_step(edition, ver_mode):build_frontend_step468,14906
build_no =build_no478,15246
cmds =cmds482,15356
cmds =cmds487,15505
def build_frontend_package_step(edition, ver_mode):build_frontend_package_step505,15960
build_no =build_no515,16312
cmds =cmds519,16422
cmds =cmds524,16580
def build_plugins_step(edition, ver_mode):build_plugins_step542,17053
env =env544,17121
env =env548,17220
def test_backend_step():test_backend_step563,17607
def test_backend_integration_step():test_backend_integration_step575,17880
def betterer_frontend_step(edition = "oss"):betterer_frontend_step587,18187
deps =deps596,18427
def test_frontend_step(edition = "oss"):test_frontend_step609,18728
deps =deps618,18962
def lint_frontend_step():lint_frontend_step634,19343
def test_a11y_frontend_step(ver_mode, edition, port = 3001):test_a11y_frontend_step652,19793
commands =commands664,20279
failure =failure667,20345
failure =failure672,20483
def frontend_metrics_step(edition, trigger = None):frontend_metrics_step693,21146
step =step706,21507
step =step721,22007
def codespell_step():codespell_step724,22066
def package_step(edition, ver_mode, variants = None):package_step736,22468
deps =deps750,23006
variants_str =variants_str757,23167
variants_str =variants_str759,23206
sign_args =sign_args762,23332
env =env763,23362
test_args =test_args769,23628
sign_args =sign_args771,23661
env =env772,23684
test_args =test_args773,23703
cmds =cmds777,23829
build_no =build_no784,24036
cmds =cmds785,24079
def grafana_server_step(edition, port = 3001):grafana_server_step798,24459
package_file_pfx =package_file_pfx808,24729
package_file_pfx =package_file_pfx810,24788
package_file_pfx =package_file_pfx812,24889
environment =environment814,24938
def e2e_tests_step(suite, edition, port = 3001, tries = None):e2e_tests_step837,25554
cmd =cmd838,25617
def cloud_plugins_e2e_tests_step(suite, edition, cloud, trigger = None):cloud_plugins_e2e_tests_step856,26186
environment =environment869,26649
when =when870,26670
when =when872,26700
environment =environment874,26748
when =when882,27129
branch =branch888,27345
step =step889,27401
step =step901,27822
def build_docs_website_step():build_docs_website_step904,27874
def copy_packages_for_docker_step(edition = None):copy_packages_for_docker_step916,28272
def build_docker_images_step(edition, archs = None, ubuntu = False, publish = False):build_docker_images_step929,28622
cmd =cmd943,29193
ubuntu_sfx =ubuntu_sfx947,29307
ubuntu_sfx =ubuntu_sfx949,29342
environment =environment955,29468
def fetch_images_step(edition):fetch_images_step979,30079
def publish_images_step(edition, ver_mode, mode, docker_repo, trigger = None):publish_images_step997,30745
name =name1013,31562
docker_repo =docker_repo1014,31585
mode =mode1016,31663
mode =mode1018,31709
environment =environment1020,31728
cmd =cmd1026,31912
deps =deps1029,32041
deps =deps1032,32147
name =name1035,32250
docker_repo =docker_repo1036,32273
cmd =cmd1038,32459
step =step1040,32565
step =step1052,32929
def postgres_integration_tests_step():postgres_integration_tests_step1056,32989
cmds =cmds1057,33028
def mysql_integration_tests_step():mysql_integration_tests_step1079,33850
cmds =cmds1080,33886
def redis_integration_tests_step():redis_integration_tests_step1100,34629
def memcached_integration_tests_step():memcached_integration_tests_step1114,35026
def release_canary_npm_packages_step(edition, trigger = None):release_canary_npm_packages_step1128,35435
step =step1141,35805
step =step1153,36143
def enterprise2_suffix(edition):enterprise2_suffix1156,36202
def upload_packages_step(edition, ver_mode, trigger = None):upload_packages_step1161,36320
deps =deps1176,36816
step =step1184,37036
step =step1195,37471
def publish_grafanacom_step(edition, ver_mode):publish_grafanacom_step1198,37530
cmd =cmd1211,38044
build_no =build_no1215,38188
cmd =cmd1216,38231
def publish_linux_packages_step(edition, package_manager = "deb"):publish_linux_packages_step1239,38866
def get_windows_steps(edition, ver_mode):get_windows_steps1261,39989
init_cmds =init_cmds1270,40281
steps =steps1278,40649
bucket =bucket1289,40941
ver_part =ver_part1291,41034
dir =dir1292,41072
dir =dir1294,41114
bucket =bucket1295,41139
build_no =build_no1296,41180
ver_part =ver_part1297,41224
installer_commands =installer_commands1298,41286
committish =committish1340,43207
committish =committish1342,43290
committish =committish1344,43350
download_grabpl_step_cmds =download_grabpl_step_cmds1347,43501
clone_cmds =clone_cmds1353,43807
def verify_gen_cue_step(edition):verify_gen_cue_step1387,45152
deps =deps1388,45186
def verify_gen_jsonnet_step(edition):verify_gen_jsonnet_step1402,45694
deps =deps1403,45732
def trigger_test_release():trigger_test_release1417,46236
def artifacts_page_step():artifacts_page_step1451,47731
def end_to_end_tests_deps():end_to_end_tests_deps1466,48058
def compile_build_cmd(edition = "oss"):compile_build_cmd1476,48321
dependencies =dependencies1477,48361
dependencies =dependencies1479,48432
def get_trigger_storybook(ver_mode):get_trigger_storybook1492,48780
trigger_storybook =trigger_storybook1500,49031
trigger_storybook =trigger_storybook1502,49088
trigger_storybook =trigger_storybook1506,49168
vault.star,444
pull_secret =pull_secret4,87
github_token =github_token5,120
drone_token =drone_token6,150
prerelease_bucket =prerelease_bucket7,178
gcp_upload_artifacts_key =gcp_upload_artifacts_key8,218
azure_sp_app_id =azure_sp_app_id9,272
azure_sp_app_pw =azure_sp_app_pw10,308
azure_tenant =azure_tenant11,344
def from_secret(secret):from_secret13,375
def vault_secret(name, path, key):vault_secret18,451
def secrets():secrets28,633
version.star,116
ver_mode =ver_mode12,197
trigger =trigger13,225
def version_branch_pipelines():version_branch_pipelines15,268

View File

@ -1,110 +1,114 @@
load('scripts/drone/vault.star', 'from_secret')
"""
This module provides functions for cronjob pipelines and steps used within.
"""
load("scripts/drone/vault.star", "from_secret")
load(
'scripts/drone/steps/lib.star',
'publish_image',
'compile_build_cmd',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"publish_image",
)
aquasec_trivy_image = 'aquasec/trivy:0.21.0'
aquasec_trivy_image = "aquasec/trivy:0.21.0"
def cronjobs():
return [
scan_docker_image_pipeline('latest'),
scan_docker_image_pipeline('main'),
scan_docker_image_pipeline('latest-ubuntu'),
scan_docker_image_pipeline('main-ubuntu'),
scan_docker_image_pipeline("latest"),
scan_docker_image_pipeline("main"),
scan_docker_image_pipeline("latest-ubuntu"),
scan_docker_image_pipeline("main-ubuntu"),
grafana_com_nightly_pipeline(),
]
def cron_job_pipeline(cronName, name, steps):
return {
'kind': 'pipeline',
'type': 'docker',
'platform': {
'os': 'linux',
'arch': 'amd64',
"kind": "pipeline",
"type": "docker",
"platform": {
"os": "linux",
"arch": "amd64",
},
'name': name,
'trigger': {
'event': 'cron',
'cron': cronName,
"name": name,
"trigger": {
"event": "cron",
"cron": cronName,
},
'clone': {
'retries': 3,
"clone": {
"retries": 3,
},
'steps': steps,
"steps": steps,
}
def scan_docker_image_pipeline(tag):
dockerImage = 'grafana/{}:{}'.format('grafana', tag)
"""Generates a cronjob pipeline for nightly scans of grafana Docker images.
Args:
tag: determines which image tag is scanned.
Returns:
Drone cronjob pipeline.
"""
docker_image = "grafana/grafana:{}".format(tag)
return cron_job_pipeline(
cronName='nightly',
name='scan-' + dockerImage + '-image',
steps=[
scan_docker_image_unkown_low_medium_vulnerabilities_step(dockerImage),
scan_docker_image_high_critical_vulnerabilities_step(dockerImage),
slack_job_failed_step('grafana-backend-ops', dockerImage),
cronName = "nightly",
name = "scan-" + docker_image + "-image",
steps = [
scan_docker_image_unkown_low_medium_vulnerabilities_step(docker_image),
scan_docker_image_high_critical_vulnerabilities_step(docker_image),
slack_job_failed_step("grafana-backend-ops", docker_image),
],
)
def scan_docker_image_unkown_low_medium_vulnerabilities_step(dockerImage):
def scan_docker_image_unkown_low_medium_vulnerabilities_step(docker_image):
return {
'name': 'scan-unkown-low-medium-vulnerabilities',
'image': aquasec_trivy_image,
'commands': [
'trivy --exit-code 0 --severity UNKNOWN,LOW,MEDIUM ' + dockerImage,
"name": "scan-unkown-low-medium-vulnerabilities",
"image": aquasec_trivy_image,
"commands": [
"trivy --exit-code 0 --severity UNKNOWN,LOW,MEDIUM " + docker_image,
],
}
def scan_docker_image_high_critical_vulnerabilities_step(dockerImage):
def scan_docker_image_high_critical_vulnerabilities_step(docker_image):
return {
'name': 'scan-high-critical-vulnerabilities',
'image': aquasec_trivy_image,
'commands': [
'trivy --exit-code 1 --severity HIGH,CRITICAL ' + dockerImage,
"name": "scan-high-critical-vulnerabilities",
"image": aquasec_trivy_image,
"commands": [
"trivy --exit-code 1 --severity HIGH,CRITICAL " + docker_image,
],
}
def slack_job_failed_step(channel, image):
return {
'name': 'slack-notify-failure',
'image': 'plugins/slack',
'settings': {
'webhook': from_secret('slack_webhook_backend'),
'channel': channel,
'template': 'Nightly docker image scan job for '
+ image
+ ' failed: {{build.link}}',
"name": "slack-notify-failure",
"image": "plugins/slack",
"settings": {
"webhook": from_secret("slack_webhook_backend"),
"channel": channel,
"template": "Nightly docker image scan job for " +
image +
" failed: {{build.link}}",
},
'when': {'status': 'failure'},
"when": {"status": "failure"},
}
def post_to_grafana_com_step():
return {
'name': 'post-to-grafana-com',
'image': publish_image,
'environment': {
'GRAFANA_COM_API_KEY': from_secret('grafana_api_key'),
'GCP_KEY': from_secret('gcp_key'),
"name": "post-to-grafana-com",
"image": publish_image,
"environment": {
"GRAFANA_COM_API_KEY": from_secret("grafana_api_key"),
"GCP_KEY": from_secret("gcp_key"),
},
'depends_on': ['compile-build-cmd'],
'commands': ['./bin/build publish grafana-com --edition oss'],
"depends_on": ["compile-build-cmd"],
"commands": ["./bin/build publish grafana-com --edition oss"],
}
def grafana_com_nightly_pipeline():
return cron_job_pipeline(
cronName='grafana-com-nightly',
name='grafana-com-nightly',
steps=[
cronName = "grafana-com-nightly",
name = "grafana-com-nightly",
steps = [
compile_build_cmd(),
post_to_grafana_com_step(),
],

View File

@ -1,125 +1,115 @@
load(
'scripts/drone/utils/utils.star',
'pipeline',
'notify_pipeline',
'failure_template',
'drone_change_template',
)
"""
This module returns all the pipelines used in the event of pushes to the main branch.
"""
load(
'scripts/drone/pipelines/docs.star',
'docs_pipelines',
'trigger_docs_main',
"scripts/drone/utils/utils.star",
"drone_change_template",
"failure_template",
"notify_pipeline",
)
load(
'scripts/drone/pipelines/test_frontend.star',
'test_frontend',
"scripts/drone/pipelines/docs.star",
"docs_pipelines",
"trigger_docs_main",
)
load(
'scripts/drone/pipelines/test_backend.star',
'test_backend',
"scripts/drone/pipelines/test_frontend.star",
"test_frontend",
)
load(
'scripts/drone/pipelines/integration_tests.star',
'integration_tests',
"scripts/drone/pipelines/test_backend.star",
"test_backend",
)
load(
'scripts/drone/pipelines/build.star',
'build_e2e',
"scripts/drone/pipelines/integration_tests.star",
"integration_tests",
)
load(
'scripts/drone/pipelines/windows.star',
'windows',
"scripts/drone/pipelines/build.star",
"build_e2e",
)
load(
'scripts/drone/pipelines/trigger_downstream.star',
'enterprise_downstream_pipeline',
"scripts/drone/pipelines/windows.star",
"windows",
)
load(
'scripts/drone/pipelines/lint_backend.star',
'lint_backend_pipeline',
"scripts/drone/pipelines/trigger_downstream.star",
"enterprise_downstream_pipeline",
)
load(
'scripts/drone/pipelines/lint_frontend.star',
'lint_frontend_pipeline',
"scripts/drone/pipelines/lint_backend.star",
"lint_backend_pipeline",
)
load(
"scripts/drone/pipelines/lint_frontend.star",
"lint_frontend_pipeline",
)
load('scripts/drone/vault.star', 'from_secret')
ver_mode = 'main'
ver_mode = "main"
trigger = {
'event': [
'push',
"event": [
"push",
],
'branch': 'main',
'paths': {
'exclude': [
'*.md',
'docs/**',
'latest.json',
"branch": "main",
"paths": {
"exclude": [
"*.md",
"docs/**",
"latest.json",
],
},
}
def main_pipelines():
drone_change_trigger = {
'event': [
'push',
"event": [
"push",
],
'branch': 'main',
'repo': [
'grafana/grafana',
"branch": "main",
"repo": [
"grafana/grafana",
],
'paths': {
'include': [
'.drone.yml',
"paths": {
"include": [
".drone.yml",
],
'exclude': [
'exclude',
"exclude": [
"exclude",
],
},
}
pipelines = [
docs_pipelines(ver_mode, trigger_docs_main()),
test_frontend(trigger, ver_mode, committish='${DRONE_COMMIT}'),
test_frontend(trigger, ver_mode),
lint_frontend_pipeline(trigger, ver_mode),
test_backend(trigger, ver_mode, committish='${DRONE_COMMIT}'),
test_backend(trigger, ver_mode),
lint_backend_pipeline(trigger, ver_mode),
build_e2e(trigger, ver_mode),
integration_tests(trigger, prefix=ver_mode),
windows(trigger, edition='oss', ver_mode=ver_mode),
integration_tests(trigger, prefix = ver_mode),
windows(trigger, edition = "oss", ver_mode = ver_mode),
notify_pipeline(
name='notify-drone-changes',
slack_channel='slack-webhooks-test',
trigger=drone_change_trigger,
template=drone_change_template,
secret='drone-changes-webhook',
name = "notify-drone-changes",
slack_channel = "slack-webhooks-test",
trigger = drone_change_trigger,
template = drone_change_template,
secret = "drone-changes-webhook",
),
enterprise_downstream_pipeline(),
notify_pipeline(
name='main-notify',
slack_channel='grafana-ci-notifications',
trigger=dict(trigger, status=['failure']),
depends_on=[
'main-test-frontend',
'main-test-backend',
'main-build-e2e-publish',
'main-integration-tests',
'main-windows',
name = "main-notify",
slack_channel = "grafana-ci-notifications",
trigger = dict(trigger, status = ["failure"]),
depends_on = [
"main-test-frontend",
"main-test-backend",
"main-build-e2e-publish",
"main-integration-tests",
"main-windows",
],
template=failure_template,
secret='slack_webhook',
template = failure_template,
secret = "slack_webhook",
),
]

View File

@ -1,143 +1,155 @@
load(
'scripts/drone/utils/utils.star',
'pipeline',
)
"""
This module returns all pipelines used in the event of a pull request.
It also includes a function generating a PR trigger from a list of included and excluded paths.
"""
load(
'scripts/drone/pipelines/test_frontend.star',
'test_frontend',
"scripts/drone/pipelines/test_frontend.star",
"test_frontend",
)
load(
'scripts/drone/pipelines/test_backend.star',
'test_backend',
"scripts/drone/pipelines/test_backend.star",
"test_backend",
)
load(
'scripts/drone/pipelines/integration_tests.star',
'integration_tests',
"scripts/drone/pipelines/integration_tests.star",
"integration_tests",
)
load(
'scripts/drone/pipelines/build.star',
'build_e2e',
"scripts/drone/pipelines/build.star",
"build_e2e",
)
load(
'scripts/drone/pipelines/verify_drone.star',
'verify_drone',
"scripts/drone/pipelines/verify_drone.star",
"verify_drone",
)
load(
'scripts/drone/pipelines/docs.star',
'docs_pipelines',
'trigger_docs_pr',
"scripts/drone/pipelines/verify_starlark.star",
"verify_starlark",
)
load(
'scripts/drone/pipelines/shellcheck.star',
'shellcheck_pipeline',
"scripts/drone/pipelines/docs.star",
"docs_pipelines",
"trigger_docs_pr",
)
load(
'scripts/drone/pipelines/lint_backend.star',
'lint_backend_pipeline',
"scripts/drone/pipelines/shellcheck.star",
"shellcheck_pipeline",
)
load(
'scripts/drone/pipelines/lint_frontend.star',
'lint_frontend_pipeline',
"scripts/drone/pipelines/lint_backend.star",
"lint_backend_pipeline",
)
load(
"scripts/drone/pipelines/lint_frontend.star",
"lint_frontend_pipeline",
)
ver_mode = 'pr'
ver_mode = "pr"
trigger = {
'event': [
'pull_request',
"event": [
"pull_request",
],
'paths': {
'exclude': [
'*.md',
'docs/**',
'latest.json',
"paths": {
"exclude": [
"*.md",
"docs/**",
"latest.json",
],
},
}
def pr_pipelines():
return [
verify_drone(
get_pr_trigger(
include_paths=['scripts/drone/**', '.drone.yml', '.drone.star']
include_paths = ["scripts/drone/**", ".drone.yml", ".drone.star"],
),
ver_mode,
),
verify_starlark(
get_pr_trigger(
include_paths = ["scripts/drone/**", ".drone.star"],
),
ver_mode,
),
test_frontend(
get_pr_trigger(
exclude_paths=['pkg/**', 'packaging/**', 'go.sum', 'go.mod']
exclude_paths = ["pkg/**", "packaging/**", "go.sum", "go.mod"],
),
ver_mode,
committish='${DRONE_COMMIT}',
),
lint_frontend_pipeline(
get_pr_trigger(
exclude_paths=['pkg/**', 'packaging/**', 'go.sum', 'go.mod']
exclude_paths = ["pkg/**", "packaging/**", "go.sum", "go.mod"],
),
ver_mode,
),
test_backend(
get_pr_trigger(
include_paths=[
'pkg/**',
'packaging/**',
'.drone.yml',
'conf/**',
'go.sum',
'go.mod',
'public/app/plugins/**/plugin.json',
'devenv/**',
]
include_paths = [
"pkg/**",
"packaging/**",
".drone.yml",
"conf/**",
"go.sum",
"go.mod",
"public/app/plugins/**/plugin.json",
"devenv/**",
],
),
ver_mode,
committish='${DRONE_COMMIT}',
),
lint_backend_pipeline(
get_pr_trigger(
include_paths=[
'pkg/**',
'packaging/**',
'conf/**',
'go.sum',
'go.mod',
'public/app/plugins/**/plugin.json',
'devenv/**',
'.bingo/**',
]
include_paths = [
"pkg/**",
"packaging/**",
"conf/**",
"go.sum",
"go.mod",
"public/app/plugins/**/plugin.json",
"devenv/**",
".bingo/**",
],
),
ver_mode,
),
build_e2e(trigger, ver_mode),
integration_tests(
get_pr_trigger(
include_paths=[
'pkg/**',
'packaging/**',
'.drone.yml',
'conf/**',
'go.sum',
'go.mod',
'public/app/plugins/**/plugin.json',
]
include_paths = [
"pkg/**",
"packaging/**",
".drone.yml",
"conf/**",
"go.sum",
"go.mod",
"public/app/plugins/**/plugin.json",
],
),
prefix=ver_mode,
prefix = ver_mode,
),
docs_pipelines(ver_mode, trigger_docs_pr()),
shellcheck_pipeline(),
]
def get_pr_trigger(include_paths = None, exclude_paths = None):
"""Generates a trigger filter from the lists of included and excluded path patterns.
def get_pr_trigger(include_paths=None, exclude_paths=None):
paths_ex = ['docs/**', '*.md']
This function is primarily intended to generate a trigger for code changes
as the patterns 'docs/**' and '*.md' are always excluded.
Args:
include_paths: a list of path patterns using the same syntax as gitignore.
Changes affecting files matching these path patterns trigger the pipeline.
exclude_paths: a list of path patterns using the same syntax as gitignore.
Changes affecting files matching these path patterns do not trigger the pipeline.
Returns:
Drone trigger.
"""
paths_ex = ["docs/**", "*.md"]
paths_in = []
if include_paths:
for path in include_paths:
@ -146,11 +158,11 @@ def get_pr_trigger(include_paths=None, exclude_paths=None):
for path in exclude_paths:
paths_ex.extend([path])
return {
'event': [
'pull_request',
"event": [
"pull_request",
],
'paths': {
'exclude': paths_ex,
'include': paths_in,
"paths": {
"exclude": paths_ex,
"include": paths_in,
},
}

View File

@ -1,155 +1,137 @@
load(
'scripts/drone/steps/lib.star',
'artifacts_page_step',
'benchmark_ldap_step',
'build_backend_step',
'build_docker_images_step',
'build_frontend_package_step',
'build_frontend_step',
'build_image',
'build_plugins_step',
'build_storybook_step',
'clone_enterprise_step',
'compile_build_cmd',
'copy_packages_for_docker_step',
'download_grabpl_step',
'e2e_tests_artifacts',
'e2e_tests_step',
'fetch_images_step',
'get_windows_steps',
'grafana_server_step',
'identify_runner_step',
'init_enterprise_step',
'lint_backend_step',
'lint_drone_step',
'lint_frontend_step',
'memcached_integration_tests_step',
'mysql_integration_tests_step',
'package_step',
'postgres_integration_tests_step',
'publish_grafanacom_step',
'publish_image',
'publish_images_step',
'publish_linux_packages_step',
'redis_integration_tests_step',
'store_storybook_step',
'test_backend_integration_step',
'test_backend_step',
'test_frontend_step',
'trigger_oss',
'upload_cdn_step',
'upload_packages_step',
'verify_gen_cue_step',
'verify_gen_jsonnet_step',
'wire_install_step',
'yarn_install_step',
)
"""
This module returns all the pipelines used in the event of a release along with supporting functions.
"""
load(
'scripts/drone/services/services.star',
'integration_test_services',
'integration_test_services_volumes',
'ldap_service',
"scripts/drone/steps/lib.star",
"artifacts_page_step",
"build_backend_step",
"build_docker_images_step",
"build_frontend_package_step",
"build_frontend_step",
"build_image",
"build_plugins_step",
"build_storybook_step",
"clone_enterprise_step",
"compile_build_cmd",
"copy_packages_for_docker_step",
"download_grabpl_step",
"e2e_tests_artifacts",
"e2e_tests_step",
"fetch_images_step",
"get_windows_steps",
"grafana_server_step",
"identify_runner_step",
"init_enterprise_step",
"memcached_integration_tests_step",
"mysql_integration_tests_step",
"package_step",
"postgres_integration_tests_step",
"publish_grafanacom_step",
"publish_image",
"publish_images_step",
"publish_linux_packages_step",
"redis_integration_tests_step",
"store_storybook_step",
"trigger_oss",
"upload_cdn_step",
"upload_packages_step",
"verify_gen_cue_step",
"verify_gen_jsonnet_step",
"wire_install_step",
"yarn_install_step",
)
load(
'scripts/drone/utils/utils.star',
'pipeline',
'notify_pipeline',
'failure_template',
'drone_change_template',
'with_deps',
"scripts/drone/services/services.star",
"integration_test_services",
"integration_test_services_volumes",
)
load(
'scripts/drone/pipelines/test_frontend.star',
'test_frontend',
'test_frontend_enterprise',
"scripts/drone/utils/utils.star",
"pipeline",
"with_deps",
)
load(
'scripts/drone/pipelines/test_backend.star',
'test_backend',
'test_backend_enterprise',
"scripts/drone/pipelines/test_frontend.star",
"test_frontend",
"test_frontend_enterprise",
)
load(
'scripts/drone/vault.star',
'from_secret',
'pull_secret',
'drone_token',
'prerelease_bucket',
"scripts/drone/pipelines/test_backend.star",
"test_backend",
"test_backend_enterprise",
)
load("scripts/drone/vault.star", "from_secret", "prerelease_bucket")
ver_mode = 'release'
ver_mode = "release"
release_trigger = {
'event': {'exclude': ['promote']},
'ref': [
'refs/tags/v*',
"event": {"exclude": ["promote"]},
"ref": [
"refs/tags/v*",
],
}
def store_npm_packages_step():
return {
'name': 'store-npm-packages',
'image': build_image,
'depends_on': [
'compile-build-cmd',
'build-frontend-packages',
"name": "store-npm-packages",
"image": build_image,
"depends_on": [
"compile-build-cmd",
"build-frontend-packages",
],
'environment': {
'GCP_KEY': from_secret('gcp_key'),
'PRERELEASE_BUCKET': from_secret(prerelease_bucket),
"environment": {
"GCP_KEY": from_secret("gcp_key"),
"PRERELEASE_BUCKET": from_secret(prerelease_bucket),
},
'commands': ['./bin/build artifacts npm store --tag ${DRONE_TAG}'],
"commands": ["./bin/build artifacts npm store --tag ${DRONE_TAG}"],
}
def retrieve_npm_packages_step():
return {
'name': 'retrieve-npm-packages',
'image': publish_image,
'depends_on': [
'compile-build-cmd',
'yarn-install',
"name": "retrieve-npm-packages",
"image": publish_image,
"depends_on": [
"compile-build-cmd",
"yarn-install",
],
'failure': 'ignore',
'environment': {
'GCP_KEY': from_secret('gcp_key'),
'PRERELEASE_BUCKET': from_secret(prerelease_bucket),
"failure": "ignore",
"environment": {
"GCP_KEY": from_secret("gcp_key"),
"PRERELEASE_BUCKET": from_secret(prerelease_bucket),
},
'commands': ['./bin/build artifacts npm retrieve --tag ${DRONE_TAG}'],
"commands": ["./bin/build artifacts npm retrieve --tag ${DRONE_TAG}"],
}
def release_npm_packages_step():
return {
'name': 'release-npm-packages',
'image': build_image,
'depends_on': [
'compile-build-cmd',
'retrieve-npm-packages',
"name": "release-npm-packages",
"image": build_image,
"depends_on": [
"compile-build-cmd",
"retrieve-npm-packages",
],
'failure': 'ignore',
'environment': {
'NPM_TOKEN': from_secret('npm_token'),
"failure": "ignore",
"environment": {
"NPM_TOKEN": from_secret("npm_token"),
},
'commands': ['./bin/build artifacts npm release --tag ${DRONE_TAG}'],
"commands": ["./bin/build artifacts npm release --tag ${DRONE_TAG}"],
}
def oss_pipelines(ver_mode = ver_mode, trigger = release_trigger):
"""Generates all pipelines used for Grafana OSS.
def oss_pipelines(ver_mode=ver_mode, trigger=release_trigger):
if ver_mode == 'release':
committish = '${DRONE_TAG}'
elif ver_mode == 'release-branch':
committish = '${DRONE_BRANCH}'
else:
committish = '${DRONE_COMMIT}'
Args:
ver_mode: controls which steps are included in the pipeline.
Defaults to 'release'.
trigger: controls which events can trigger the pipeline execution.
Defaults to tag events for tags with a 'v' prefix.
environment = {'EDITION': 'oss'}
Returns:
List of Drone pipelines.
"""
environment = {"EDITION": "oss"}
services = integration_test_services(edition='oss')
services = integration_test_services(edition = "oss")
volumes = integration_test_services_volumes()
init_steps = [
@ -162,46 +144,50 @@ def oss_pipelines(ver_mode=ver_mode, trigger=release_trigger):
]
build_steps = [
build_backend_step(edition='oss', ver_mode=ver_mode),
build_frontend_step(edition='oss', ver_mode=ver_mode),
build_frontend_package_step(edition='oss', ver_mode=ver_mode),
build_plugins_step(edition='oss', ver_mode=ver_mode),
package_step(edition='oss', ver_mode=ver_mode),
build_backend_step(edition = "oss", ver_mode = ver_mode),
build_frontend_step(edition = "oss", ver_mode = ver_mode),
build_frontend_package_step(edition = "oss", ver_mode = ver_mode),
build_plugins_step(edition = "oss", ver_mode = ver_mode),
package_step(edition = "oss", ver_mode = ver_mode),
copy_packages_for_docker_step(),
build_docker_images_step(edition='oss', ver_mode=ver_mode, publish=True),
build_docker_images_step(edition = "oss", publish = True),
build_docker_images_step(
edition='oss', ver_mode=ver_mode, publish=True, ubuntu=True
edition = "oss",
publish = True,
ubuntu = True,
),
grafana_server_step(edition='oss'),
e2e_tests_step('dashboards-suite', tries=3),
e2e_tests_step('smoke-tests-suite', tries=3),
e2e_tests_step('panels-suite', tries=3),
e2e_tests_step('various-suite', tries=3),
grafana_server_step(edition = "oss"),
e2e_tests_step("dashboards-suite", tries = 3),
e2e_tests_step("smoke-tests-suite", tries = 3),
e2e_tests_step("panels-suite", tries = 3),
e2e_tests_step("various-suite", tries = 3),
e2e_tests_artifacts(),
build_storybook_step(ver_mode=ver_mode),
build_storybook_step(ver_mode = ver_mode),
]
publish_steps = []
if ver_mode in (
'release',
'release-branch',
"release",
"release-branch",
):
publish_steps.extend(
[
upload_cdn_step(edition='oss', ver_mode=ver_mode, trigger=trigger_oss),
upload_cdn_step(edition = "oss", ver_mode = ver_mode, trigger = trigger_oss),
upload_packages_step(
edition='oss', ver_mode=ver_mode, trigger=trigger_oss
edition = "oss",
ver_mode = ver_mode,
trigger = trigger_oss,
),
]
],
)
if ver_mode in ('release',):
if ver_mode in ("release",):
publish_steps.extend(
[
store_storybook_step(ver_mode=ver_mode),
store_storybook_step(ver_mode = ver_mode),
store_npm_packages_step(),
]
],
)
integration_test_steps = [
@ -210,74 +196,84 @@ def oss_pipelines(ver_mode=ver_mode, trigger=release_trigger):
]
windows_pipeline = pipeline(
name='{}-oss-windows'.format(ver_mode),
edition='oss',
trigger=trigger,
steps=get_windows_steps(edition='oss', ver_mode=ver_mode),
platform='windows',
depends_on=[
name = "{}-oss-windows".format(ver_mode),
edition = "oss",
trigger = trigger,
steps = get_windows_steps(edition = "oss", ver_mode = ver_mode),
platform = "windows",
depends_on = [
# 'oss-build-e2e-publish-{}'.format(ver_mode),
'{}-oss-build-e2e-publish'.format(ver_mode),
'{}-oss-test-frontend'.format(ver_mode),
'{}-oss-test-backend'.format(ver_mode),
'{}-oss-integration-tests'.format(ver_mode),
"{}-oss-build-e2e-publish".format(ver_mode),
"{}-oss-test-frontend".format(ver_mode),
"{}-oss-test-backend".format(ver_mode),
"{}-oss-integration-tests".format(ver_mode),
],
environment=environment,
environment = environment,
)
pipelines = [
pipeline(
name='{}-oss-build-e2e-publish'.format(ver_mode),
edition='oss',
trigger=trigger,
services=[],
steps=init_steps + build_steps + publish_steps,
environment=environment,
volumes=volumes,
name = "{}-oss-build-e2e-publish".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = init_steps + build_steps + publish_steps,
environment = environment,
volumes = volumes,
),
test_frontend(trigger, ver_mode, committish=committish),
test_backend(trigger, ver_mode, committish=committish),
test_frontend(trigger, ver_mode),
test_backend(trigger, ver_mode),
pipeline(
name='{}-oss-integration-tests'.format(ver_mode),
edition='oss',
trigger=trigger,
services=services,
steps=[
download_grabpl_step(),
identify_runner_step(),
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
wire_install_step(),
]
+ integration_test_steps,
environment=environment,
volumes=volumes,
name = "{}-oss-integration-tests".format(ver_mode),
edition = "oss",
trigger = trigger,
services = services,
steps = [
download_grabpl_step(),
identify_runner_step(),
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
wire_install_step(),
] +
integration_test_steps,
environment = environment,
volumes = volumes,
),
windows_pipeline,
]
return pipelines
def enterprise_pipelines(ver_mode = ver_mode, trigger = release_trigger):
"""Generates all pipelines used for Grafana Enterprise.
def enterprise_pipelines(ver_mode=ver_mode, trigger=release_trigger):
if ver_mode == 'release':
committish = '${DRONE_TAG}'
elif ver_mode == 'release-branch':
committish = '${DRONE_BRANCH}'
Args:
ver_mode: controls which steps are included in the pipeline.
Defaults to 'release'.
trigger: controls which events can trigger the pipeline execution.
Defaults to tag events for tags with a 'v' prefix.
Returns:
List of Drone pipelines.
"""
if ver_mode == "release":
committish = "${DRONE_TAG}"
elif ver_mode == "release-branch":
committish = "${DRONE_BRANCH}"
else:
committish = '${DRONE_COMMIT}'
committish = "${DRONE_COMMIT}"
environment = {'EDITION': 'enterprise'}
environment = {"EDITION": "enterprise"}
services = integration_test_services(edition='enterprise')
services = integration_test_services(edition = "enterprise")
volumes = integration_test_services_volumes()
init_steps = [
download_grabpl_step(),
identify_runner_step(),
clone_enterprise_step(committish=committish),
clone_enterprise_step(committish = committish),
init_enterprise_step(ver_mode),
compile_build_cmd('enterprise'),
compile_build_cmd("enterprise"),
] + with_deps(
[
wire_install_step(),
@ -286,50 +282,56 @@ def enterprise_pipelines(ver_mode=ver_mode, trigger=release_trigger):
verify_gen_jsonnet_step(),
],
[
'init-enterprise',
"init-enterprise",
],
)
build_steps = [
build_backend_step(edition='enterprise', ver_mode=ver_mode),
build_frontend_step(edition='enterprise', ver_mode=ver_mode),
build_frontend_package_step(edition='enterprise', ver_mode=ver_mode),
build_plugins_step(edition='enterprise', ver_mode=ver_mode),
build_backend_step(edition = "enterprise", ver_mode = ver_mode),
build_frontend_step(edition = "enterprise", ver_mode = ver_mode),
build_frontend_package_step(edition = "enterprise", ver_mode = ver_mode),
build_plugins_step(edition = "enterprise", ver_mode = ver_mode),
package_step(
edition='enterprise',
ver_mode=ver_mode,
edition = "enterprise",
ver_mode = ver_mode,
),
copy_packages_for_docker_step(),
build_docker_images_step(edition='enterprise', ver_mode=ver_mode, publish=True),
build_docker_images_step(edition = "enterprise", publish = True),
build_docker_images_step(
edition='enterprise', ver_mode=ver_mode, publish=True, ubuntu=True
edition = "enterprise",
publish = True,
ubuntu = True,
),
grafana_server_step(edition='enterprise'),
e2e_tests_step('dashboards-suite', tries=3),
e2e_tests_step('smoke-tests-suite', tries=3),
e2e_tests_step('panels-suite', tries=3),
e2e_tests_step('various-suite', tries=3),
grafana_server_step(edition = "enterprise"),
e2e_tests_step("dashboards-suite", tries = 3),
e2e_tests_step("smoke-tests-suite", tries = 3),
e2e_tests_step("panels-suite", tries = 3),
e2e_tests_step("various-suite", tries = 3),
e2e_tests_artifacts(),
]
publish_steps = []
if ver_mode in (
'release',
'release-branch',
"release",
"release-branch",
):
upload_packages_enterprise = upload_packages_step(
edition='enterprise', ver_mode=ver_mode, trigger=trigger_oss
edition = "enterprise",
ver_mode = ver_mode,
trigger = trigger_oss,
)
upload_packages_enterprise['depends_on'] = ['package']
upload_packages_enterprise["depends_on"] = ["package"]
publish_steps.extend(
[
upload_cdn_step(
edition='enterprise', ver_mode=ver_mode, trigger=trigger_oss
edition = "enterprise",
ver_mode = ver_mode,
trigger = trigger_oss,
),
upload_packages_enterprise,
]
],
)
integration_test_steps = [
@ -338,91 +340,103 @@ def enterprise_pipelines(ver_mode=ver_mode, trigger=release_trigger):
]
windows_pipeline = pipeline(
name='{}-enterprise-windows'.format(ver_mode),
edition='enterprise',
trigger=trigger,
steps=get_windows_steps(edition='enterprise', ver_mode=ver_mode),
platform='windows',
depends_on=[
name = "{}-enterprise-windows".format(ver_mode),
edition = "enterprise",
trigger = trigger,
steps = get_windows_steps(edition = "enterprise", ver_mode = ver_mode),
platform = "windows",
depends_on = [
# 'enterprise-build-e2e-publish-{}'.format(ver_mode),
'{}-enterprise-build-e2e-publish'.format(ver_mode),
'{}-enterprise-test-frontend'.format(ver_mode),
'{}-enterprise-test-backend'.format(ver_mode),
'{}-enterprise-integration-tests'.format(ver_mode),
"{}-enterprise-build-e2e-publish".format(ver_mode),
"{}-enterprise-test-frontend".format(ver_mode),
"{}-enterprise-test-backend".format(ver_mode),
"{}-enterprise-integration-tests".format(ver_mode),
],
environment=environment,
environment = environment,
)
pipelines = [
pipeline(
name='{}-enterprise-build-e2e-publish'.format(ver_mode),
edition='enterprise',
trigger=trigger,
services=[],
steps=init_steps + build_steps + publish_steps,
environment=environment,
volumes=volumes,
name = "{}-enterprise-build-e2e-publish".format(ver_mode),
edition = "enterprise",
trigger = trigger,
services = [],
steps = init_steps + build_steps + publish_steps,
environment = environment,
volumes = volumes,
),
test_frontend_enterprise(trigger, ver_mode, committish=committish),
test_backend_enterprise(trigger, ver_mode, committish=committish),
test_frontend_enterprise(trigger, ver_mode, committish = committish),
test_backend_enterprise(trigger, ver_mode, committish = committish),
pipeline(
name='{}-enterprise-integration-tests'.format(ver_mode),
edition='enterprise',
trigger=trigger,
services=services,
steps=[
download_grabpl_step(),
identify_runner_step(),
clone_enterprise_step(committish=committish),
init_enterprise_step(ver_mode),
]
+ with_deps(
[
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
],
[
'init-enterprise',
],
)
+ [
wire_install_step(),
]
+ integration_test_steps
+ [
redis_integration_tests_step(),
memcached_integration_tests_step(),
],
environment=environment,
volumes=volumes,
name = "{}-enterprise-integration-tests".format(ver_mode),
edition = "enterprise",
trigger = trigger,
services = services,
steps = [
download_grabpl_step(),
identify_runner_step(),
clone_enterprise_step(committish = committish),
init_enterprise_step(ver_mode),
] +
with_deps(
[
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
],
[
"init-enterprise",
],
) +
[
wire_install_step(),
] +
integration_test_steps +
[
redis_integration_tests_step(),
memcached_integration_tests_step(),
],
environment = environment,
volumes = volumes,
),
windows_pipeline,
]
return pipelines
def enterprise2_pipelines(prefix = "", ver_mode = ver_mode, trigger = release_trigger):
"""Generate the next generation of pipelines for Grafana Enterprise.
def enterprise2_pipelines(prefix='', ver_mode=ver_mode, trigger=release_trigger):
if ver_mode == 'release':
committish = '${DRONE_TAG}'
elif ver_mode == 'release-branch':
committish = '${DRONE_BRANCH}'
Args:
prefix: a prefix for the pipeline name used to differentiate multiple instances of
the same pipeline.
Defaults to ''.
ver_mode: controls which steps are included in the pipeline.
Defaults to 'release'.
trigger: controls which events can trigger the pipeline execution.
Defaults to tag events for tags with a 'v' prefix.
Returns:
List of Drone pipelines.
"""
if ver_mode == "release":
committish = "${DRONE_TAG}"
elif ver_mode == "release-branch":
committish = "${DRONE_BRANCH}"
else:
committish = '${DRONE_COMMIT}'
committish = "${DRONE_COMMIT}"
environment = {
'EDITION': 'enterprise2',
"EDITION": "enterprise2",
}
services = integration_test_services(edition='enterprise')
volumes = integration_test_services_volumes()
init_steps = [
download_grabpl_step(),
identify_runner_step(),
clone_enterprise_step(committish=committish),
clone_enterprise_step(committish = committish),
init_enterprise_step(ver_mode),
compile_build_cmd('enterprise'),
compile_build_cmd("enterprise"),
] + with_deps(
[
wire_install_step(),
@ -430,104 +444,107 @@ def enterprise2_pipelines(prefix='', ver_mode=ver_mode, trigger=release_trigger)
verify_gen_cue_step(),
],
[
'init-enterprise',
"init-enterprise",
],
)
build_steps = [
build_frontend_step(edition='enterprise', ver_mode=ver_mode),
build_frontend_package_step(edition='enterprise', ver_mode=ver_mode),
build_plugins_step(edition='enterprise', ver_mode=ver_mode),
build_frontend_step(edition = "enterprise", ver_mode = ver_mode),
build_frontend_package_step(edition = "enterprise", ver_mode = ver_mode),
build_plugins_step(edition = "enterprise", ver_mode = ver_mode),
build_backend_step(
edition='enterprise2', ver_mode=ver_mode, variants=['linux-amd64']
edition = "enterprise2",
ver_mode = ver_mode,
variants = ["linux-amd64"],
),
]
fetch_images = fetch_images_step('enterprise2')
fetch_images = fetch_images_step("enterprise2")
fetch_images.update(
{'depends_on': ['build-docker-images', 'build-docker-images-ubuntu']}
{"depends_on": ["build-docker-images", "build-docker-images-ubuntu"]},
)
upload_cdn = upload_cdn_step(edition='enterprise2', ver_mode=ver_mode)
upload_cdn['environment'].update(
{'ENTERPRISE2_CDN_PATH': from_secret('enterprise2-cdn-path')}
upload_cdn = upload_cdn_step(edition = "enterprise2", ver_mode = ver_mode)
upload_cdn["environment"].update(
{"ENTERPRISE2_CDN_PATH": from_secret("enterprise2-cdn-path")},
)
build_steps.extend(
[
package_step(
edition='enterprise2',
ver_mode=ver_mode,
variants=['linux-amd64'],
edition = "enterprise2",
ver_mode = ver_mode,
variants = ["linux-amd64"],
),
upload_cdn,
copy_packages_for_docker_step(edition='enterprise2'),
copy_packages_for_docker_step(edition = "enterprise2"),
build_docker_images_step(
edition='enterprise2', ver_mode=ver_mode, publish=True
edition = "enterprise2",
publish = True,
),
build_docker_images_step(
edition='enterprise2', ver_mode=ver_mode, publish=True, ubuntu=True
edition = "enterprise2",
publish = True,
ubuntu = True,
),
fetch_images,
publish_images_step(
'enterprise2',
'release',
mode='enterprise2',
docker_repo='${{DOCKER_ENTERPRISE2_REPO}}',
"enterprise2",
"release",
mode = "enterprise2",
docker_repo = "${{DOCKER_ENTERPRISE2_REPO}}",
),
]
],
)
publish_steps = []
if ver_mode in (
'release',
'release-branch',
"release",
"release-branch",
):
step = upload_packages_step(edition='enterprise2', ver_mode=ver_mode)
step['depends_on'] = ['package-enterprise2']
step = upload_packages_step(edition = "enterprise2", ver_mode = ver_mode)
step["depends_on"] = ["package-enterprise2"]
publish_steps.append(step)
pipelines = [
pipeline(
name='{}{}-enterprise2-build-e2e-publish'.format(prefix, ver_mode),
edition='enterprise',
trigger=trigger,
services=[],
steps=init_steps + build_steps + publish_steps,
volumes=volumes,
environment=environment,
name = "{}{}-enterprise2-build-e2e-publish".format(prefix, ver_mode),
edition = "enterprise",
trigger = trigger,
services = [],
steps = init_steps + build_steps + publish_steps,
volumes = volumes,
environment = environment,
),
]
return pipelines
def publish_artifacts_step(mode):
security = ''
if mode == 'security':
security = '--security '
security = ""
if mode == "security":
security = "--security "
return {
'name': 'publish-artifacts',
'image': publish_image,
'environment': {
'GCP_KEY': from_secret('gcp_key'),
'PRERELEASE_BUCKET': from_secret('prerelease_bucket'),
"name": "publish-artifacts",
"image": publish_image,
"environment": {
"GCP_KEY": from_secret("gcp_key"),
"PRERELEASE_BUCKET": from_secret("prerelease_bucket"),
},
'commands': [
'./bin/grabpl artifacts publish {}--tag $${{DRONE_TAG}} --src-bucket $${{PRERELEASE_BUCKET}}'.format(
security
)
"commands": [
"./bin/grabpl artifacts publish {}--tag $${{DRONE_TAG}} --src-bucket $${{PRERELEASE_BUCKET}}".format(
security,
),
],
'depends_on': ['grabpl'],
"depends_on": ["grabpl"],
}
def publish_artifacts_pipelines(mode):
trigger = {
'event': ['promote'],
'target': [mode],
"event": ["promote"],
"target": [mode],
}
steps = [
download_grabpl_step(),
@ -536,65 +553,69 @@ def publish_artifacts_pipelines(mode):
return [
pipeline(
name='publish-artifacts-{}'.format(mode),
trigger=trigger,
steps=steps,
edition="all",
environment={'EDITION': 'all'},
)
name = "publish-artifacts-{}".format(mode),
trigger = trigger,
steps = steps,
edition = "all",
environment = {"EDITION": "all"},
),
]
def publish_packages_pipeline():
"""Generates pipelines used for publishing packages for both OSS and enterprise.
Returns:
List of Drone pipelines. One for each of OSS and enterprise packages.
"""
trigger = {
'event': ['promote'],
'target': ['public'],
"event": ["promote"],
"target": ["public"],
}
oss_steps = [
download_grabpl_step(),
compile_build_cmd(),
publish_linux_packages_step(edition='oss', package_manager='deb'),
publish_linux_packages_step(edition='oss', package_manager='rpm'),
publish_grafanacom_step(edition='oss', ver_mode='release'),
publish_linux_packages_step(edition = "oss", package_manager = "deb"),
publish_linux_packages_step(edition = "oss", package_manager = "rpm"),
publish_grafanacom_step(edition = "oss", ver_mode = "release"),
]
enterprise_steps = [
download_grabpl_step(),
compile_build_cmd(),
publish_linux_packages_step(edition='enterprise', package_manager='deb'),
publish_linux_packages_step(edition='enterprise', package_manager='rpm'),
publish_grafanacom_step(edition='enterprise', ver_mode='release'),
publish_linux_packages_step(edition = "enterprise", package_manager = "deb"),
publish_linux_packages_step(edition = "enterprise", package_manager = "rpm"),
publish_grafanacom_step(edition = "enterprise", ver_mode = "release"),
]
deps = [
'publish-artifacts-public',
'publish-docker-oss-public',
'publish-docker-enterprise-public',
"publish-artifacts-public",
"publish-docker-oss-public",
"publish-docker-enterprise-public",
]
return [
pipeline(
name='publish-packages-oss',
trigger=trigger,
steps=oss_steps,
edition="all",
depends_on=deps,
environment={'EDITION': 'oss'},
name = "publish-packages-oss",
trigger = trigger,
steps = oss_steps,
edition = "all",
depends_on = deps,
environment = {"EDITION": "oss"},
),
pipeline(
name='publish-packages-enterprise',
trigger=trigger,
steps=enterprise_steps,
edition="all",
depends_on=deps,
environment={'EDITION': 'enterprise'},
name = "publish-packages-enterprise",
trigger = trigger,
steps = enterprise_steps,
edition = "all",
depends_on = deps,
environment = {"EDITION": "enterprise"},
),
]
def publish_npm_pipelines():
trigger = {
'event': ['promote'],
'target': ['public'],
"event": ["promote"],
"target": ["public"],
}
steps = [
compile_build_cmd(),
@ -605,26 +626,25 @@ def publish_npm_pipelines():
return [
pipeline(
name='publish-npm-packages-public',
trigger=trigger,
steps=steps,
edition="all",
environment={'EDITION': 'all'},
)
name = "publish-npm-packages-public",
trigger = trigger,
steps = steps,
edition = "all",
environment = {"EDITION": "all"},
),
]
def artifacts_page_pipeline():
trigger = {
'event': ['promote'],
'target': 'security',
"event": ["promote"],
"target": "security",
}
return [
pipeline(
name='publish-artifacts-page',
trigger=trigger,
steps=[download_grabpl_step(), artifacts_page_step()],
edition="all",
environment={'EDITION': 'all'},
)
name = "publish-artifacts-page",
trigger = trigger,
steps = [download_grabpl_step(), artifacts_page_step()],
edition = "all",
environment = {"EDITION": "all"},
),
]
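A recurring pattern in the release pipelines above is to take a generated step and adjust it in place, setting depends_on by assignment and extending nested dictionaries with .update() rather than concatenating dicts. A minimal sketch of that pattern, using a hypothetical build_step() generator standing in for helpers such as upload_cdn_step or upload_packages_step:

def build_step():
    # Hypothetical step generator used only for illustration.
    return {
        "name": "build",
        "image": build_image,
        "environment": {},
        "commands": ["./bin/build"],
    }

step = build_step()

# Replace the dependency list outright...
step["depends_on"] = ["package"]

# ...and merge extra keys into nested dictionaries with .update().
step["environment"].update({"EDITION": "enterprise2"})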

View File

@ -1,38 +1,43 @@
"""
This module contains steps and pipelines for publishing to AWS Marketplace.
"""
load(
'scripts/drone/steps/lib.star',
'download_grabpl_step',
'publish_images_step',
'compile_build_cmd',
'fetch_images_step',
'publish_image',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"fetch_images_step",
"publish_image",
)
load('scripts/drone/vault.star', 'from_secret')
load("scripts/drone/vault.star", "from_secret")
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/utils/utils.star",
"pipeline",
)
def publish_aws_marketplace_step():
return {
'name': 'publish-aws-marketplace',
'image': publish_image,
'commands': ['./bin/build publish aws --image grafana/grafana-enterprise --repo grafana-labs/grafanaenterprise --product 422b46fb-bea6-4f27-8bcc-832117bd627e'],
'depends_on': ['fetch-images-enterprise'],
'environment': {
'AWS_REGION': from_secret('aws_region'),
'AWS_ACCESS_KEY_ID': from_secret('aws_access_key_id'),
'AWS_SECRET_ACCESS_KEY': from_secret('aws_secret_access_key'),
"name": "publish-aws-marketplace",
"image": publish_image,
"commands": ["./bin/build publish aws --image grafana/grafana-enterprise --repo grafana-labs/grafanaenterprise --product 422b46fb-bea6-4f27-8bcc-832117bd627e"],
"depends_on": ["fetch-images-enterprise"],
"environment": {
"AWS_REGION": from_secret("aws_region"),
"AWS_ACCESS_KEY_ID": from_secret("aws_access_key_id"),
"AWS_SECRET_ACCESS_KEY": from_secret("aws_secret_access_key"),
},
'volumes': [{'name': 'docker', 'path': '/var/run/docker.sock'}],
"volumes": [{"name": "docker", "path": "/var/run/docker.sock"}],
}
def publish_aws_marketplace_pipeline(mode):
trigger = {
'event': ['promote'],
'target': [mode],
"event": ["promote"],
"target": [mode],
}
return [pipeline(
name='publish-aws-marketplace-{}'.format(mode), trigger=trigger, steps=[compile_build_cmd(), fetch_images_step('enterprise'), publish_aws_marketplace_step()], edition="", depends_on = ['publish-docker-enterprise-public'], environment = {'EDITION': 'enterprise2'}
),]
name = "publish-aws-marketplace-{}".format(mode),
trigger = trigger,
steps = [compile_build_cmd(), fetch_images_step("enterprise"), publish_aws_marketplace_step()],
edition = "",
depends_on = ["publish-docker-enterprise-public"],
environment = {"EDITION": "enterprise2"},
)]

View File

@ -1,52 +1,55 @@
load(
'scripts/drone/steps/lib.star',
'benchmark_ldap_step',
'betterer_frontend_step',
'build_backend_step',
'build_docker_images_step',
'build_frontend_package_step',
'build_frontend_step',
'build_image',
'build_plugins_step',
'build_storybook_step',
'cloud_plugins_e2e_tests_step',
'compile_build_cmd',
'copy_packages_for_docker_step',
'download_grabpl_step',
'e2e_tests_artifacts',
'e2e_tests_step',
'enterprise_downstream_step',
'frontend_metrics_step',
'grafana_server_step',
'identify_runner_step',
'memcached_integration_tests_step',
'mysql_integration_tests_step',
'package_step',
'postgres_integration_tests_step',
'publish_images_step',
'redis_integration_tests_step',
'release_canary_npm_packages_step',
'store_storybook_step',
'test_a11y_frontend_step',
'trigger_oss',
'trigger_test_release',
'upload_cdn_step',
'upload_packages_step',
'verify_gen_cue_step',
'verify_gen_jsonnet_step',
'wire_install_step',
'yarn_install_step',
)
"""This module contains the comprehensive build pipeline."""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"build_backend_step",
"build_docker_images_step",
"build_frontend_package_step",
"build_frontend_step",
"build_plugins_step",
"build_storybook_step",
"cloud_plugins_e2e_tests_step",
"compile_build_cmd",
"copy_packages_for_docker_step",
"download_grabpl_step",
"e2e_tests_artifacts",
"e2e_tests_step",
"enterprise_downstream_step",
"frontend_metrics_step",
"grafana_server_step",
"identify_runner_step",
"package_step",
"publish_images_step",
"release_canary_npm_packages_step",
"store_storybook_step",
"test_a11y_frontend_step",
"trigger_oss",
"trigger_test_release",
"upload_cdn_step",
"upload_packages_step",
"verify_gen_cue_step",
"verify_gen_jsonnet_step",
"wire_install_step",
"yarn_install_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
# @unused
def build_e2e(trigger, ver_mode):
edition = 'oss'
environment = {'EDITION': edition}
"""Perform e2e building, testing, and publishing."
Args:
trigger: controls which events can trigger the pipeline execution.
ver_mode: used in the naming of the pipeline.
Returns:
Drone pipeline.
"""
edition = "oss"
environment = {"EDITION": edition}
init_steps = [
identify_runner_step(),
download_grabpl_step(),
@ -60,101 +63,107 @@ def build_e2e(trigger, ver_mode):
build_steps = []
variants = None
if ver_mode == 'pr':
if ver_mode == "pr":
build_steps.extend(
[
trigger_test_release(),
enterprise_downstream_step(ver_mode=ver_mode),
]
enterprise_downstream_step(ver_mode = ver_mode),
],
)
variants = [
'linux-amd64',
'linux-amd64-musl',
'darwin-amd64',
'windows-amd64',
"linux-amd64",
"linux-amd64-musl",
"darwin-amd64",
"windows-amd64",
]
build_steps.extend(
[
build_backend_step(edition=edition, ver_mode=ver_mode),
build_frontend_step(edition=edition, ver_mode=ver_mode),
build_frontend_package_step(edition=edition, ver_mode=ver_mode),
build_plugins_step(edition=edition, ver_mode=ver_mode),
package_step(edition=edition, ver_mode=ver_mode, variants=variants),
grafana_server_step(edition=edition),
e2e_tests_step('dashboards-suite'),
e2e_tests_step('smoke-tests-suite'),
e2e_tests_step('panels-suite'),
e2e_tests_step('various-suite'),
build_backend_step(edition = edition, ver_mode = ver_mode),
build_frontend_step(edition = edition, ver_mode = ver_mode),
build_frontend_package_step(edition = edition, ver_mode = ver_mode),
build_plugins_step(edition = edition, ver_mode = ver_mode),
package_step(edition = edition, variants = variants, ver_mode = ver_mode),
grafana_server_step(edition = edition),
e2e_tests_step("dashboards-suite"),
e2e_tests_step("smoke-tests-suite"),
e2e_tests_step("panels-suite"),
e2e_tests_step("various-suite"),
cloud_plugins_e2e_tests_step(
'cloud-plugins-suite',
cloud='azure',
trigger=trigger_oss,
"cloud-plugins-suite",
cloud = "azure",
trigger = trigger_oss,
),
e2e_tests_artifacts(),
build_storybook_step(ver_mode=ver_mode),
build_storybook_step(ver_mode = ver_mode),
copy_packages_for_docker_step(),
test_a11y_frontend_step(ver_mode=ver_mode),
]
test_a11y_frontend_step(ver_mode = ver_mode),
],
)
if ver_mode == 'main':
if ver_mode == "main":
build_steps.extend(
[
store_storybook_step(ver_mode=ver_mode, trigger=trigger_oss),
frontend_metrics_step(trigger=trigger_oss),
store_storybook_step(trigger = trigger_oss, ver_mode = ver_mode),
frontend_metrics_step(trigger = trigger_oss),
build_docker_images_step(
edition=edition, ver_mode=ver_mode, publish=False
edition = edition,
publish = False,
),
build_docker_images_step(
edition=edition, ver_mode=ver_mode, publish=False, ubuntu=True
edition = edition,
publish = False,
ubuntu = True,
),
publish_images_step(
edition=edition,
ver_mode=ver_mode,
mode='',
docker_repo='grafana',
trigger=trigger_oss,
docker_repo = "grafana",
edition = edition,
mode = "",
trigger = trigger_oss,
ver_mode = ver_mode,
),
publish_images_step(
edition=edition,
ver_mode=ver_mode,
mode='',
docker_repo='grafana-oss',
trigger=trigger_oss,
docker_repo = "grafana-oss",
edition = edition,
mode = "",
trigger = trigger_oss,
ver_mode = ver_mode,
),
release_canary_npm_packages_step(trigger=trigger_oss),
release_canary_npm_packages_step(trigger = trigger_oss),
upload_packages_step(
edition=edition, ver_mode=ver_mode, trigger=trigger_oss
edition = edition,
trigger = trigger_oss,
ver_mode = ver_mode,
),
upload_cdn_step(
edition=edition, ver_mode=ver_mode, trigger=trigger_oss
edition = edition,
trigger = trigger_oss,
ver_mode = ver_mode,
),
]
],
)
elif ver_mode == 'pr':
elif ver_mode == "pr":
build_steps.extend(
[
build_docker_images_step(
edition=edition,
ver_mode=ver_mode,
archs=[
'amd64',
archs = [
"amd64",
],
)
]
edition = edition,
),
],
)
publish_suffix = ''
if ver_mode == 'main':
publish_suffix = '-publish'
publish_suffix = ""
if ver_mode == "main":
publish_suffix = "-publish"
return pipeline(
name='{}-build-e2e{}'.format(ver_mode, publish_suffix),
edition="oss",
trigger=trigger,
services=[],
steps=init_steps + build_steps,
environment=environment,
name = "{}-build-e2e{}".format(ver_mode, publish_suffix),
edition = "oss",
environment = environment,
services = [],
steps = init_steps + build_steps,
trigger = trigger,
)

View File

@ -1,39 +1,32 @@
load(
'scripts/drone/steps/lib.star',
'build_image',
'yarn_install_step',
'identify_runner_step',
'download_grabpl_step',
'lint_frontend_step',
'codespell_step',
'test_frontend_step',
'build_storybook_step',
'build_docs_website_step',
)
"""
This module returns all the pipelines used in the event of documentation changes along with supporting functions.
"""
load(
'scripts/drone/services/services.star',
'integration_test_services',
'ldap_service',
"scripts/drone/steps/lib.star",
"build_docs_website_step",
"build_image",
"codespell_step",
"download_grabpl_step",
"identify_runner_step",
"yarn_install_step",
)
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/utils/utils.star",
"pipeline",
)
docs_paths = {
'include': [
'*.md',
'docs/**',
'packages/**/*.md',
'latest.json',
"include": [
"*.md",
"docs/**",
"packages/**/*.md",
"latest.json",
],
}
def docs_pipelines(ver_mode, trigger):
environment = {'EDITION': 'oss'}
environment = {"EDITION": "oss"}
steps = [
download_grabpl_step(),
identify_runner_step(),
@ -44,45 +37,42 @@ def docs_pipelines(ver_mode, trigger):
]
return pipeline(
name='{}-docs'.format(ver_mode),
edition='oss',
trigger=trigger,
services=[],
steps=steps,
environment=environment,
name = "{}-docs".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = steps,
environment = environment,
)
def lint_docs():
return {
'name': 'lint-docs',
'image': build_image,
'depends_on': [
'yarn-install',
"name": "lint-docs",
"image": build_image,
"depends_on": [
"yarn-install",
],
'environment': {
'NODE_OPTIONS': '--max_old_space_size=8192',
"environment": {
"NODE_OPTIONS": "--max_old_space_size=8192",
},
'commands': [
'yarn run prettier:checkDocs',
"commands": [
"yarn run prettier:checkDocs",
],
}
def trigger_docs_main():
return {
'branch': 'main',
'event': [
'push',
"branch": "main",
"event": [
"push",
],
'paths': docs_paths,
"paths": docs_paths,
}
def trigger_docs_pr():
return {
'event': [
'pull_request',
"event": [
"pull_request",
],
'paths': docs_paths,
"paths": docs_paths,
}

View File

@ -1,36 +1,40 @@
"""
This module contains steps and pipelines relating to GitHub.
"""
load(
'scripts/drone/steps/lib.star',
'download_grabpl_step',
'publish_images_step',
'compile_build_cmd',
'fetch_images_step',
'publish_image',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"fetch_images_step",
"publish_image",
)
load('scripts/drone/vault.star', 'from_secret')
load("scripts/drone/vault.star", "from_secret")
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/utils/utils.star",
"pipeline",
)
def publish_github_step():
return {
'name': 'publish-github',
'image': publish_image,
'commands': ['./bin/build publish github --repo $${GH_REGISTRY} --create'],
'depends_on': ['fetch-images-enterprise2'],
'environment': {
'GH_TOKEN': from_secret('github_token'),
'GH_REGISTRY': from_secret('gh_registry'),
"name": "publish-github",
"image": publish_image,
"commands": ["./bin/build publish github --repo $${GH_REGISTRY} --create"],
"depends_on": ["fetch-images-enterprise2"],
"environment": {
"GH_TOKEN": from_secret("github_token"),
"GH_REGISTRY": from_secret("gh_registry"),
},
}
def publish_github_pipeline(mode):
trigger = {
'event': ['promote'],
'target': [mode],
"event": ["promote"],
"target": [mode],
}
return [pipeline(
name='publish-github-{}'.format(mode), trigger=trigger, steps=[compile_build_cmd(), fetch_images_step('enterprise2'), publish_github_step()], edition="", environment = {'EDITION': 'enterprise2'}
),]
name = "publish-github-{}".format(mode),
trigger = trigger,
steps = [compile_build_cmd(), fetch_images_step("enterprise2"), publish_github_step()],
edition = "",
environment = {"EDITION": "enterprise2"},
)]

View File

@ -1,32 +1,41 @@
load(
'scripts/drone/steps/lib.star',
'compile_build_cmd',
'download_grabpl_step',
'identify_runner_step',
'verify_gen_cue_step',
'verify_gen_jsonnet_step',
'wire_install_step',
'postgres_integration_tests_step',
'mysql_integration_tests_step',
)
"""
This module returns the pipeline used for integration tests.
"""
load(
'scripts/drone/services/services.star',
'integration_test_services',
'integration_test_services_volumes',
'ldap_service',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"download_grabpl_step",
"identify_runner_step",
"mysql_integration_tests_step",
"postgres_integration_tests_step",
"verify_gen_cue_step",
"verify_gen_jsonnet_step",
"wire_install_step",
)
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/services/services.star",
"integration_test_services",
"integration_test_services_volumes",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def integration_tests(trigger, prefix):
environment = {'EDITION': 'oss'}
"""Generate a pipeline for integration tests.
services = integration_test_services(edition="oss")
Args:
trigger: controls which events can trigger the pipeline execution.
prefix: used in the naming of the pipeline.
Returns:
Drone pipeline.
"""
environment = {"EDITION": "oss"}
services = integration_test_services(edition = "oss")
volumes = integration_test_services_volumes()
init_steps = [
@ -44,11 +53,11 @@ def integration_tests(trigger, prefix):
]
return pipeline(
name='{}-integration-tests'.format(prefix),
edition='oss',
trigger=trigger,
environment=environment,
services=services,
volumes=volumes,
steps=init_steps + test_steps,
name = "{}-integration-tests".format(prefix),
edition = "oss",
trigger = trigger,
environment = environment,
services = services,
volumes = volumes,
steps = init_steps + test_steps,
)

View File

@ -1,23 +1,34 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'wire_install_step',
'lint_backend_step',
'lint_drone_step',
'compile_build_cmd',
)
"""
This module returns the pipeline used for linting backend code.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"identify_runner_step",
"lint_backend_step",
"lint_drone_step",
"wire_install_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def lint_backend_pipeline(trigger, ver_mode):
environment = {'EDITION': 'oss'}
"""Generates the pipelines used linting backend code.
Args:
trigger: controls which events can trigger the pipeline execution.
ver_mode: used in the naming of the pipeline.
Returns:
Drone pipeline.
"""
environment = {"EDITION": "oss"}
wire_step = wire_install_step()
wire_step.update({'depends_on': []})
wire_step.update({"depends_on": []})
init_steps = [
identify_runner_step(),
@ -29,14 +40,14 @@ def lint_backend_pipeline(trigger, ver_mode):
lint_backend_step(),
]
if ver_mode == 'main':
if ver_mode == "main":
test_steps.append(lint_drone_step())
return pipeline(
name='{}-lint-backend'.format(ver_mode),
edition="oss",
trigger=trigger,
services=[],
steps=init_steps + test_steps,
environment=environment,
name = "{}-lint-backend".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = init_steps + test_steps,
environment = environment,
)

View File

@ -1,18 +1,29 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'yarn_install_step',
'lint_frontend_step',
)
"""
This module returns the pipeline used for linting frontend code.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"identify_runner_step",
"lint_frontend_step",
"yarn_install_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def lint_frontend_pipeline(trigger, ver_mode):
environment = {'EDITION': 'oss'}
"""Generates the pipelines used linting frontend code.
Args:
trigger: controls which events can trigger the pipeline execution.
ver_mode: used in the naming of the pipeline.
Returns:
Drone pipeline.
"""
environment = {"EDITION": "oss"}
init_steps = [
identify_runner_step(),
@ -24,10 +35,10 @@ def lint_frontend_pipeline(trigger, ver_mode):
]
return pipeline(
name='{}-lint-frontend'.format(ver_mode),
edition="oss",
trigger=trigger,
services=[],
steps=init_steps + test_steps,
environment=environment,
name = "{}-lint-frontend".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = init_steps + test_steps,
environment = environment,
)

View File

@ -1,75 +1,97 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'download_grabpl_step',
'publish_images_step',
'compile_build_cmd',
'fetch_images_step',
)
"""
This module returns the pipeline used for publishing Docker images and its steps.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"download_grabpl_step",
"fetch_images_step",
"identify_runner_step",
"publish_images_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def publish_image_steps(edition, mode, docker_repo):
"""Generates the steps used for publising Docker images using grabpl.
Args:
edition: controls which version of an image is fetched in the case of a release.
It also controls which publishing implementation is used.
If edition == 'oss', it additionally publishes the grafana/grafana-oss repository.
mode: used to control the publishing of security images when mode == 'security'.
docker_repo: the Docker image name.
It is combined with the 'grafana/' library prefix.
Returns:
List of Drone steps.
"""
steps = [
identify_runner_step(),
download_grabpl_step(),
compile_build_cmd(),
fetch_images_step(edition),
publish_images_step(edition, 'release', mode, docker_repo),
publish_images_step(edition, "release", mode, docker_repo),
]
if edition == 'oss':
if edition == "oss":
steps.append(
publish_images_step(edition, 'release', mode, 'grafana/grafana-oss')
publish_images_step(edition, "release", mode, "grafana/grafana-oss"),
)
return steps
def publish_image_pipelines_public():
mode = 'public'
"""Generates the pipeline used for publising public Docker images.
Returns:
List of Drone pipelines.
"""
mode = "public"
trigger = {
'event': ['promote'],
'target': [mode],
"event": ["promote"],
"target": [mode],
}
return [
pipeline(
name='publish-docker-oss-{}'.format(mode),
trigger=trigger,
steps=publish_image_steps(edition='oss', mode=mode, docker_repo='grafana'),
edition="",
environment={'EDITION': 'oss'},
name = "publish-docker-oss-{}".format(mode),
trigger = trigger,
steps = publish_image_steps(edition = "oss", mode = mode, docker_repo = "grafana"),
edition = "",
environment = {"EDITION": "oss"},
),
pipeline(
name='publish-docker-enterprise-{}'.format(mode),
trigger=trigger,
steps=publish_image_steps(
edition='enterprise', mode=mode, docker_repo='grafana-enterprise'
name = "publish-docker-enterprise-{}".format(mode),
trigger = trigger,
steps = publish_image_steps(
edition = "enterprise",
mode = mode,
docker_repo = "grafana-enterprise",
),
edition="",
environment={'EDITION': 'enterprise'},
edition = "",
environment = {"EDITION": "enterprise"},
),
]
def publish_image_pipelines_security():
mode = 'security'
mode = "security"
trigger = {
'event': ['promote'],
'target': [mode],
"event": ["promote"],
"target": [mode],
}
return [
pipeline(
name='publish-docker-enterprise-{}'.format(mode),
trigger=trigger,
steps=publish_image_steps(
edition='enterprise', mode=mode, docker_repo='grafana-enterprise'
name = "publish-docker-enterprise-{}".format(mode),
trigger = trigger,
steps = publish_image_steps(
edition = "enterprise",
mode = mode,
docker_repo = "grafana-enterprise",
),
edition="",
environment={'EDITION': 'enterprise'},
edition = "",
environment = {"EDITION": "enterprise"},
),
]
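As a rough usage sketch of the edition branch in publish_image_steps (the individual step names come from lib.star and are not shown here):

# For OSS the helper yields one extra publish step targeting grafana/grafana-oss;
# for enterprise it does not.
oss_steps = publish_image_steps(edition = "oss", mode = "public", docker_repo = "grafana")
enterprise_steps = publish_image_steps(
    edition = "enterprise",
    mode = "public",
    docker_repo = "grafana-enterprise",
)
# len(oss_steps) == len(enterprise_steps) + 1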

View File

@ -1,49 +1,50 @@
load('scripts/drone/steps/lib.star', 'build_image', 'compile_build_cmd')
"""
This module returns a Drone step and pipeline for linting with shellcheck.
"""
load("scripts/drone/steps/lib.star", "build_image", "compile_build_cmd")
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/utils/utils.star",
"pipeline",
)
trigger = {
'event': [
'pull_request',
"event": [
"pull_request",
],
'paths': {
'exclude': [
'*.md',
'docs/**',
'latest.json',
"paths": {
"exclude": [
"*.md",
"docs/**",
"latest.json",
],
'include': ['scripts/**/*.sh'],
"include": ["scripts/**/*.sh"],
},
}
def shellcheck_step():
return {
'name': 'shellcheck',
'image': build_image,
'depends_on': [
'compile-build-cmd',
"name": "shellcheck",
"image": build_image,
"depends_on": [
"compile-build-cmd",
],
'commands': [
'./bin/build shellcheck',
"commands": [
"./bin/build shellcheck",
],
}
def shellcheck_pipeline():
environment = {'EDITION': 'oss'}
environment = {"EDITION": "oss"}
steps = [
compile_build_cmd(),
shellcheck_step(),
]
return pipeline(
name='pr-shellcheck',
edition="oss",
trigger=trigger,
services=[],
steps=steps,
environment=environment,
name = "pr-shellcheck",
edition = "oss",
trigger = trigger,
services = [],
steps = steps,
environment = environment,
)

View File

@ -1,30 +1,41 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'download_grabpl_step',
'wire_install_step',
'test_backend_step',
'test_backend_integration_step',
'verify_gen_cue_step',
'verify_gen_jsonnet_step',
'compile_build_cmd',
'clone_enterprise_step',
'init_enterprise_step',
)
"""
This module returns the pipeline used for testing backend code.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
'with_deps',
"scripts/drone/utils/utils.star",
"pipeline",
"with_deps",
)
load(
"scripts/drone/steps/lib.star",
"clone_enterprise_step",
"compile_build_cmd",
"download_grabpl_step",
"identify_runner_step",
"init_enterprise_step",
"test_backend_integration_step",
"test_backend_step",
"verify_gen_cue_step",
"verify_gen_jsonnet_step",
"wire_install_step",
)
def test_backend(trigger, ver_mode):
"""Generates the pipeline used for testing OSS backend code.
def test_backend(trigger, ver_mode, committish):
environment = {'EDITION': 'oss'}
Args:
trigger: a Drone trigger for the pipeline.
ver_mode: affects the pipeline name.
Returns:
Drone pipeline.
"""
environment = {"EDITION": "oss"}
steps = [
identify_runner_step(),
compile_build_cmd(edition='oss'),
compile_build_cmd(edition = "oss"),
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
wire_install_step(),
@ -32,21 +43,31 @@ def test_backend(trigger, ver_mode, committish):
test_backend_integration_step(),
]
pipeline_name = '{}-test-backend'.format(ver_mode)
pipeline_name = "{}-test-backend".format(ver_mode)
if ver_mode in ("release-branch", "release"):
pipeline_name = '{}-{}-test-backend'.format(ver_mode, 'oss')
pipeline_name = "{}-{}-test-backend".format(ver_mode, "oss")
return pipeline(
name=pipeline_name,
edition='oss',
trigger=trigger,
steps=steps,
environment=environment,
name = pipeline_name,
edition = "oss",
trigger = trigger,
steps = steps,
environment = environment,
)
def test_backend_enterprise(trigger, ver_mode, committish, edition = "enterprise"):
"""Generates the pipeline used for testing backend enterprise code.
def test_backend_enterprise(trigger, ver_mode, committish, edition="enterprise"):
environment = {'EDITION': edition}
Args:
trigger: a Drone trigger for the pipeline.
ver_mode: affects the pipeline name.
committish: controls what revision of enterprise code to test with.
edition: affects the clone step in the pipeline and also affects the pipeline name.
Returns:
Drone pipeline.
"""
environment = {"EDITION": edition}
steps = (
[
@ -55,31 +76,31 @@ def test_backend_enterprise(trigger, ver_mode, committish, edition="enterprise")
init_enterprise_step(ver_mode),
identify_runner_step(),
compile_build_cmd(edition),
]
+ with_deps(
] +
with_deps(
[
verify_gen_cue_step(),
verify_gen_jsonnet_step(),
],
[
'init-enterprise',
"init-enterprise",
],
)
+ [
) +
[
wire_install_step(),
test_backend_step(),
test_backend_integration_step(),
]
)
pipeline_name = '{}-test-backend'.format(ver_mode)
pipeline_name = "{}-test-backend".format(ver_mode)
if ver_mode in ("release-branch", "release"):
pipeline_name = '{}-{}-test-backend'.format(ver_mode, edition)
pipeline_name = "{}-{}-test-backend".format(ver_mode, edition)
return pipeline(
name=pipeline_name,
edition=edition,
trigger=trigger,
steps=steps,
environment=environment,
name = pipeline_name,
edition = edition,
trigger = trigger,
steps = steps,
environment = environment,
)
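with_deps is loaded from scripts/drone/utils/utils.star and its definition is not part of this section; a plausible sketch, assuming it only pins depends_on on each step it receives, is:

# Assumed shape of with_deps; the real implementation may differ.
def with_deps(steps, deps = []):
    """Sets the dependencies of a list of steps to a fixed list of step names.

    Args:
      steps: Drone steps to modify.
      deps: names of the steps that each given step should depend on.

    Returns:
      The modified steps.
    """
    for step in steps:
        step["depends_on"] = deps
    return steps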

View File

@ -1,47 +1,68 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'clone_enterprise_step',
'init_enterprise_step',
'download_grabpl_step',
'yarn_install_step',
'betterer_frontend_step',
'test_frontend_step',
)
"""
This module returns the pipeline used for testing frontend code.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
'with_deps',
"scripts/drone/utils/utils.star",
"pipeline",
"with_deps",
)
load(
"scripts/drone/steps/lib.star",
"betterer_frontend_step",
"clone_enterprise_step",
"download_grabpl_step",
"identify_runner_step",
"init_enterprise_step",
"test_frontend_step",
"yarn_install_step",
)
def test_frontend(trigger, ver_mode):
"""Generates the pipeline used for testing frontend code.
def test_frontend(trigger, ver_mode, committish):
environment = {'EDITION': 'oss'}
Args:
trigger: a Drone trigger for the pipeline.
ver_mode: affects the pipeline name.
Returns:
Drone pipeline.
"""
environment = {"EDITION": "oss"}
steps = [
identify_runner_step(),
download_grabpl_step(),
yarn_install_step(),
betterer_frontend_step(edition='oss'),
test_frontend_step(edition='oss'),
betterer_frontend_step(edition = "oss"),
test_frontend_step(edition = "oss"),
]
pipeline_name = '{}-test-frontend'.format(ver_mode)
pipeline_name = "{}-test-frontend".format(ver_mode)
if ver_mode in ("release-branch", "release"):
pipeline_name = '{}-{}-test-frontend'.format(ver_mode, 'oss')
pipeline_name = "{}-{}-test-frontend".format(ver_mode, "oss")
return pipeline(
name=pipeline_name,
edition='oss',
trigger=trigger,
steps=steps,
environment=environment,
name = pipeline_name,
edition = "oss",
trigger = trigger,
steps = steps,
environment = environment,
)
def test_frontend_enterprise(trigger, ver_mode, committish, edition = "enterprise"):
"""Generates the pipeline used for testing frontend enterprise code.
def test_frontend_enterprise(trigger, ver_mode, committish, edition='enterprise'):
environment = {'EDITION': edition}
Args:
trigger: a Drone trigger for the pipeline.
ver_mode: affects the pipeline name.
committish: controls what revision of enterprise code to test with.
edition: affects the clone step in the pipeline and also affects the pipeline name.
Returns:
Drone pipeline.
"""
environment = {"EDITION": edition}
steps = (
[
@ -49,22 +70,22 @@ def test_frontend_enterprise(trigger, ver_mode, committish, edition='enterprise'
init_enterprise_step(ver_mode),
identify_runner_step(),
download_grabpl_step(),
]
+ with_deps([yarn_install_step()], ['init-enterprise'])
+ [
] +
with_deps([yarn_install_step()], ["init-enterprise"]) +
[
betterer_frontend_step(edition),
test_frontend_step(edition),
]
)
pipeline_name = '{}-test-frontend'.format(ver_mode)
pipeline_name = "{}-test-frontend".format(ver_mode)
if ver_mode in ("release-branch", "release"):
pipeline_name = '{}-{}-test-frontend'.format(ver_mode, edition)
pipeline_name = "{}-{}-test-frontend".format(ver_mode, edition)
return pipeline(
name=pipeline_name,
edition=edition,
trigger=trigger,
steps=steps,
environment=environment,
name = pipeline_name,
edition = edition,
trigger = trigger,
steps = steps,
environment = environment,
)

View File

@ -1,43 +1,45 @@
load(
'scripts/drone/steps/lib.star',
'enterprise_downstream_step',
)
"""
This module returns the pipeline used for triggering a downstream pipeline for Grafana Enterprise.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"enterprise_downstream_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
trigger = {
'event': [
'push',
"event": [
"push",
],
'branch': 'main',
'paths': {
'exclude': [
'*.md',
'docs/**',
'latest.json',
"branch": "main",
"paths": {
"exclude": [
"*.md",
"docs/**",
"latest.json",
],
},
}
def enterprise_downstream_pipeline():
environment = {'EDITION': 'oss'}
environment = {"EDITION": "oss"}
steps = [
enterprise_downstream_step(ver_mode='main'),
enterprise_downstream_step(ver_mode = "main"),
]
deps = [
'main-build-e2e-publish',
'main-integration-tests',
"main-build-e2e-publish",
"main-integration-tests",
]
return pipeline(
name='main-trigger-downstream',
edition='oss',
trigger=trigger,
services=[],
steps=steps,
depends_on=deps,
environment=environment,
name = "main-trigger-downstream",
edition = "oss",
trigger = trigger,
services = [],
steps = steps,
depends_on = deps,
environment = environment,
)

View File

@ -1,19 +1,21 @@
load(
'scripts/drone/steps/lib.star',
'identify_runner_step',
'download_grabpl_step',
'lint_drone_step',
'compile_build_cmd',
)
"""
This module returns the pipeline used for verifying Drone configuration.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"download_grabpl_step",
"identify_runner_step",
"lint_drone_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def verify_drone(trigger, ver_mode):
environment = {'EDITION': 'oss'}
environment = {"EDITION": "oss"}
steps = [
identify_runner_step(),
download_grabpl_step(),
@ -21,10 +23,10 @@ def verify_drone(trigger, ver_mode):
lint_drone_step(),
]
return pipeline(
name='{}-verify-drone'.format(ver_mode),
edition="oss",
trigger=trigger,
services=[],
steps=steps,
environment=environment,
name = "{}-verify-drone".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = steps,
environment = environment,
)

View File

@ -0,0 +1,32 @@
"""
This module returns a Drone pipeline that verifies all Starlark files are linted.
"""
load(
"scripts/drone/steps/lib.star",
"compile_build_cmd",
"download_grabpl_step",
"identify_runner_step",
"lint_starlark_step",
)
load(
"scripts/drone/utils/utils.star",
"pipeline",
)
def verify_starlark(trigger, ver_mode):
environment = {"EDITION": "oss"}
steps = [
identify_runner_step(),
download_grabpl_step(),
compile_build_cmd(),
lint_starlark_step(),
]
return pipeline(
name = "{}-verify-starlark".format(ver_mode),
edition = "oss",
trigger = trigger,
services = [],
steps = steps,
environment = environment,
)
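lint_starlark_step is loaded from scripts/drone/steps/lib.star, whose definition is not visible in this section; a hedged sketch of what such a step could look like, assuming the shared build image ships buildifier and the check runs through the build command, is:

# Hypothetical sketch only; the real step may use a different image or command.
def lint_starlark_step():
    return {
        "name": "lint-starlark",
        "image": build_image,  # assumption: buildifier is available in this image
        "depends_on": [
            "compile-build-cmd",
        ],
        "commands": [
            "./bin/build verify-starlark .",  # assumed subcommand name
        ],
    }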

View File

@ -1,28 +1,41 @@
load(
'scripts/drone/steps/lib.star',
'get_windows_steps',
)
"""
This module returns the pipeline used for building Grafana on Windows.
"""
load(
'scripts/drone/utils/utils.star',
'pipeline',
"scripts/drone/utils/utils.star",
"pipeline",
)
load(
"scripts/drone/steps/lib.star",
"get_windows_steps",
)
def windows(trigger, edition, ver_mode):
environment = {'EDITION': edition}
"""Generates the pipeline used for building Grafana on Windows.
Args:
trigger: a Drone trigger for the pipeline.
edition: controls whether enterprise code is included in the pipeline steps.
ver_mode: controls whether a pre-release or actual release pipeline is generated.
Also indirectly controls which version of enterprise code is used.
Returns:
Drone pipeline.
"""
environment = {"EDITION": edition}
return pipeline(
name='main-windows',
edition=edition,
trigger=dict(trigger, repo=['grafana/grafana']),
steps=get_windows_steps(edition, ver_mode),
depends_on=[
'main-test-frontend',
'main-test-backend',
'main-build-e2e-publish',
'main-integration-tests',
name = "main-windows",
edition = edition,
trigger = dict(trigger, repo = ["grafana/grafana"]),
steps = get_windows_steps(edition, ver_mode),
depends_on = [
"main-test-frontend",
"main-test-backend",
"main-build-e2e-publish",
"main-integration-tests",
],
platform='windows',
environment=environment,
platform = "windows",
environment = environment,
)
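The repository restriction is added by copying the incoming trigger with dict(), which leaves the caller's dictionary untouched; a small sketch of that behaviour:

# dict(d, **kwargs) returns a copy of d with the extra keys set.
base_trigger = {"event": ["push"], "branch": "main"}
windows_trigger = dict(base_trigger, repo = ["grafana/grafana"])
# windows_trigger == {"event": ["push"], "branch": "main", "repo": ["grafana/grafana"]}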

View File

@ -1,64 +1,66 @@
"""
This module has functions for Drone services to be used in pipelines.
"""
def integration_test_services_volumes():
return [
{'name': 'postgres', 'temp': {'medium': 'memory'}},
{'name': 'mysql', 'temp': {'medium': 'memory'}},
{"name": "postgres", "temp": {"medium": "memory"}},
{"name": "mysql", "temp": {"medium": "memory"}},
]
def integration_test_services(edition):
services = [
{
'name': 'postgres',
'image': 'postgres:12.3-alpine',
'environment': {
'POSTGRES_USER': 'grafanatest',
'POSTGRES_PASSWORD': 'grafanatest',
'POSTGRES_DB': 'grafanatest',
'PGDATA': '/var/lib/postgresql/data/pgdata',
"name": "postgres",
"image": "postgres:12.3-alpine",
"environment": {
"POSTGRES_USER": "grafanatest",
"POSTGRES_PASSWORD": "grafanatest",
"POSTGRES_DB": "grafanatest",
"PGDATA": "/var/lib/postgresql/data/pgdata",
},
'volumes': [
{'name': 'postgres', 'path': '/var/lib/postgresql/data/pgdata'}
"volumes": [
{"name": "postgres", "path": "/var/lib/postgresql/data/pgdata"},
],
},
{
'name': 'mysql',
'image': 'mysql:5.7.39',
'environment': {
'MYSQL_ROOT_PASSWORD': 'rootpass',
'MYSQL_DATABASE': 'grafana_tests',
'MYSQL_USER': 'grafana',
'MYSQL_PASSWORD': 'password',
"name": "mysql",
"image": "mysql:5.7.39",
"environment": {
"MYSQL_ROOT_PASSWORD": "rootpass",
"MYSQL_DATABASE": "grafana_tests",
"MYSQL_USER": "grafana",
"MYSQL_PASSWORD": "password",
},
'volumes': [{'name': 'mysql', 'path': '/var/lib/mysql'}],
"volumes": [{"name": "mysql", "path": "/var/lib/mysql"}],
},
]
if edition in ('enterprise', 'enterprise2'):
if edition in ("enterprise", "enterprise2"):
services.extend(
[
{
'name': 'redis',
'image': 'redis:6.2.1-alpine',
'environment': {},
"name": "redis",
"image": "redis:6.2.1-alpine",
"environment": {},
},
{
'name': 'memcached',
'image': 'memcached:1.6.9-alpine',
'environment': {},
"name": "memcached",
"image": "memcached:1.6.9-alpine",
"environment": {},
},
]
],
)
return services
def ldap_service():
return {
'name': 'ldap',
'image': 'osixia/openldap:1.4.0',
'environment': {
'LDAP_ADMIN_PASSWORD': 'grafana',
'LDAP_DOMAIN': 'grafana.org',
'SLAPD_ADDITIONAL_MODULES': 'memberof',
"name": "ldap",
"image": "osixia/openldap:1.4.0",
"environment": {
"LDAP_ADMIN_PASSWORD": "grafana",
"LDAP_DOMAIN": "grafana.org",
"SLAPD_ADDITIONAL_MODULES": "memberof",
},
}
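The volume names returned by integration_test_services_volumes must match the names mounted by the services so Drone can back them with in-memory storage; a usage sketch mirroring how the integration-test pipelines above wire the two together:

# Usage sketch; the step list is left empty for brevity.
services = integration_test_services(edition = "enterprise")
volumes = integration_test_services_volumes()

integration_pipeline = pipeline(
    name = "example-integration-tests",
    edition = "enterprise",
    trigger = {"event": ["pull_request"]},
    services = services,
    volumes = volumes,
    steps = [],
)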

File diff suppressed because it is too large Load Diff

View File

@ -1,112 +1,136 @@
"""
This module contains utility functions for generating Drone pipelines.
"""

load(
    "scripts/drone/steps/lib.star",
    "slack_step",
)
load("scripts/drone/vault.star", "pull_secret")

failure_template = "Build {{build.number}} failed for commit: <https://github.com/{{repo.owner}}/{{repo.name}}/commit/{{build.commit}}|{{ truncate build.commit 8 }}>: {{build.link}}\nBranch: <https://github.com/{{ repo.owner }}/{{ repo.name }}/commits/{{ build.branch }}|{{ build.branch }}>\nAuthor: {{build.author}}"
drone_change_template = "`.drone.yml` and `starlark` files have been changed on the OSS repo, by: {{build.author}}. \nBranch: <https://github.com/{{ repo.owner }}/{{ repo.name }}/commits/{{ build.branch }}|{{ build.branch }}>\nCommit hash: <https://github.com/{{repo.owner}}/{{repo.name}}/commit/{{build.commit}}|{{ truncate build.commit 8 }}>"

def pipeline(
        name,
        edition,
        trigger,
        steps,
        services = [],
        platform = "linux",
        depends_on = [],
        environment = None,
        volumes = []):
    """Generate a Drone Docker pipeline with commonly used values.

    In addition to the parameters provided, it configures:
      - the use of an image pull secret
      - a retry count for cloning
      - a volume 'docker' that can be used to access the Docker socket

    Args:
      name: controls the pipeline name.
      edition: used to differentiate the pipeline for enterprise builds.
      trigger: a Drone trigger for the pipeline.
      steps: the Drone steps for the pipeline.
      services: auxiliary services used during the pipeline.
        Defaults to [].
      platform: abstracts platform specific configuration primarily for different Drone behavior on Windows.
        Defaults to 'linux'.
      depends_on: list of pipelines that must have succeeded before this pipeline can start.
        Defaults to [].
      environment: environment variables passed through to pipeline steps.
        Defaults to None.
      volumes: additional volumes available to be mounted by pipeline steps.
        Defaults to [].

    Returns:
      Drone pipeline.
    """
    if platform != "windows":
        platform_conf = {
            "platform": {"os": "linux", "arch": "amd64"},
            # A shared cache is used on the host
            # To avoid issues with parallel builds, we run this repo on single build agents
            "node": {"type": "no-parallel"},
        }
    else:
        platform_conf = {
            "platform": {
                "os": "windows",
                "arch": "amd64",
                "version": "1809",
            },
        }

    pipeline = {
        "kind": "pipeline",
        "type": "docker",
        "name": name,
        "trigger": trigger,
        "services": services,
        "steps": steps,
        "clone": {
            "retries": 3,
        },
        "volumes": [
            {
                "name": "docker",
                "host": {
                    "path": "/var/run/docker.sock",
                },
            },
        ],
        "depends_on": depends_on,
        "image_pull_secrets": [pull_secret],
    }

    if environment:
        pipeline.update(
            {
                "environment": environment,
            },
        )

    pipeline["volumes"].extend(volumes)
    pipeline.update(platform_conf)

    if edition in ("enterprise", "enterprise2"):
        # We have a custom clone step for enterprise
        pipeline["clone"] = {
            "disable": True,
        }

    return pipeline

def notify_pipeline(
        name,
        slack_channel,
        trigger,
        depends_on = [],
        template = None,
        secret = None):
    trigger = dict(trigger)
    return {
        "kind": "pipeline",
        "type": "docker",
        "platform": {
            "os": "linux",
            "arch": "amd64",
        },
        "name": name,
        "trigger": trigger,
        "steps": [
            slack_step(slack_channel, template, secret),
        ],
        "clone": {
            "retries": 3,
        },
        "depends_on": depends_on,
    }

# TODO: this overrides any existing dependencies because we're following the existing logic
# it should append to any existing dependencies
def with_deps(steps, deps = []):
    for step in steps:
        step["depends_on"] = deps
    return steps
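
As a quick illustration of how the utilities above combine, the following sketch (not part of this diff; the step, channel, and names are made up) builds one pipeline plus a Slack notification pipeline that depends on it:

def example_pipelines(trigger):
    # Give every step an explicit dependency on the clone step.
    steps = with_deps(
        [{"name": "build", "image": "alpine:3.17", "commands": ["echo build"]}],
        ["clone"],
    )
    build = pipeline(
        name = "example-build",
        edition = "oss",
        trigger = trigger,
        steps = steps,
    )
    # Notify a Slack channel if the build pipeline fails.
    notify = notify_pipeline(
        name = "example-build-notify",
        slack_channel = "example-channel",
        trigger = trigger,
        depends_on = [build["name"]],
        template = failure_template,
    )
    return [build, notify]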

View File

@@ -1,97 +1,97 @@
"""
This module returns functions for generating Drone secrets fetched from Vault.
"""

pull_secret = "dockerconfigjson"
drone_token = "drone_token"
prerelease_bucket = "prerelease_bucket"
gcp_upload_artifacts_key = "gcp_upload_artifacts_key"
azure_sp_app_id = "azure_sp_app_id"
azure_sp_app_pw = "azure_sp_app_pw"
azure_tenant = "azure_tenant"

def from_secret(secret):
    return {"from_secret": secret}

def vault_secret(name, path, key):
    return {
        "kind": "secret",
        "name": name,
        "get": {
            "path": path,
            "name": key,
        },
    }

def secrets():
    return [
        vault_secret(pull_secret, "secret/data/common/gcr", ".dockerconfigjson"),
        vault_secret("github_token", "infra/data/ci/github/grafanabot", "pat"),
        vault_secret(drone_token, "infra/data/ci/drone", "machine-user-token"),
        vault_secret(prerelease_bucket, "infra/data/ci/grafana/prerelease", "bucket"),
        vault_secret(
            gcp_upload_artifacts_key,
            "infra/data/ci/grafana/releng/artifacts-uploader-service-account",
            "credentials.json",
        ),
        vault_secret(
            azure_sp_app_id,
            "infra/data/ci/datasources/cpp-azure-resourcemanager-credentials",
            "application_id",
        ),
        vault_secret(
            azure_sp_app_pw,
            "infra/data/ci/datasources/cpp-azure-resourcemanager-credentials",
            "application_secret",
        ),
        vault_secret(
            azure_tenant,
            "infra/data/ci/datasources/cpp-azure-resourcemanager-credentials",
            "tenant_id",
        ),
        # Package publishing
        vault_secret(
            "packages_gpg_public_key",
            "infra/data/ci/packages-publish/gpg",
            "public-key-b64",
        ),
        vault_secret(
            "packages_gpg_private_key",
            "infra/data/ci/packages-publish/gpg",
            "private-key-b64",
        ),
        vault_secret(
            "packages_gpg_passphrase",
            "infra/data/ci/packages-publish/gpg",
            "passphrase",
        ),
        vault_secret(
            "packages_service_account",
            "infra/data/ci/packages-publish/service-account",
            "credentials.json",
        ),
        vault_secret(
            "packages_access_key_id",
            "infra/data/ci/packages-publish/bucket-credentials",
            "AccessID",
        ),
        vault_secret(
            "packages_secret_access_key",
            "infra/data/ci/packages-publish/bucket-credentials",
            "Secret",
        ),
        vault_secret(
            "aws_region",
            "secret/data/common/aws-marketplace",
            "aws_region",
        ),
        vault_secret(
            "aws_access_key_id",
            "secret/data/common/aws-marketplace",
            "aws_access_key_id",
        ),
        vault_secret(
            "aws_secret_access_key",
            "secret/data/common/aws-marketplace",
            "aws_secret_access_key",
        ),
    ]
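
For context, a minimal usage sketch (not part of this diff): a step references a secret by name through `from_secret`, while `secrets()` supplies the Vault-backed secret objects that Drone resolves at runtime. The step below is made up for illustration.

def example_publish_step():
    return {
        "name": "example-publish",
        "image": "alpine:3.17",
        "environment": {
            # Resolved by Drone from the Vault secret declared in secrets().
            "GCP_KEY": from_secret(gcp_upload_artifacts_key),
        },
        "commands": ["echo publish"],
    }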

View File

@@ -1,17 +1,20 @@
"""
This module returns the pipeline used for version branches.
"""
load(
'scripts/drone/events/release.star',
'oss_pipelines',
'enterprise_pipelines',
'enterprise2_pipelines',
"scripts/drone/events/release.star",
"enterprise2_pipelines",
"enterprise_pipelines",
"oss_pipelines",
)
ver_mode = 'release-branch'
trigger = {'ref': ['refs/heads/v[0-9]*']}
ver_mode = "release-branch"
trigger = {"ref": ["refs/heads/v[0-9]*"]}
def version_branch_pipelines():
return (
oss_pipelines(ver_mode=ver_mode, trigger=trigger)
+ enterprise_pipelines(ver_mode=ver_mode, trigger=trigger)
+ enterprise2_pipelines(ver_mode=ver_mode, trigger=trigger)
oss_pipelines(ver_mode = ver_mode, trigger = trigger) +
enterprise_pipelines(ver_mode = ver_mode, trigger = trigger) +
enterprise2_pipelines(ver_mode = ver_mode, trigger = trigger)
)
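
To close the loop, a sketch of how this module's output might be emitted from the repository's Drone entry point; `def main(ctx)` is the standard Drone Starlark convention, but the load path and the surrounding entry-point code are assumptions rather than part of this diff.

load("scripts/drone/events/version.star", "version_branch_pipelines")  # path assumed

def main(_ctx):
    # Drone collects the list of pipeline objects returned here.
    return version_branch_pipelines()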