Slack notification for BigQuery results using GitHub Actions

Author: ZoruChan


Have you ever felt like this?

"I want to monitor my data in BQ, but I don't want to write code or set up infrastructure."

This article is for you.

Solution

Why not use GitHub Actions to run the query on a schedule and post the results to Slack? I built a GitHub Action that does exactly that, so feel free to use it:

https://github.com/marketplace/actions/bigquery-sql-execution-with-slack-notification-for-results

Setup

✅ GitHub Actions can execute code on a preset schedule, so GitHub does the work for us: no need to set up a Scheduled Query, Pub/Sub topic, notification channel, etc. (see the cron note right after this list)
✅ A GCP Service Account with sufficient privileges to run BigQuery queries. I recommend saving the contents of the Service Account JSON key as a repository secret.
✅ A Slack webhook URL. A plain incoming webhook is enough; no need to overcomplicate.
✅ Write your SQL.
✅ Set up your workflow YAML.
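
One thing to keep in mind for the schedule: GitHub Actions evaluates cron expressions in UTC, so shift the hour to match your timezone. A minimal sketch of the trigger block (the weekday-only schedule below is just an illustration):

on:
  schedule:
    # GitHub Actions evaluates cron expressions in UTC
    - cron: '0 9 * * 1-5' # weekdays at 09:00 UTC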

Here is a sample:

name: Run SQL against BQ and notify slack
on:
  schedule:
    - cron: '0 10 * * *' # run every day at 10:00 UTC
jobs:
  bq2slack:
    runs-on: ubuntu-latest
    name: Execute SQL query and send results to Slack
    steps:
      - name: Run
        uses: data-i-consulting/bq2slack-github-action@v1.0.6
        with:
          slack_webhook: ${{ secrets.SLACK_WEBHOOK }}
          gcp_service_account: ${{ secrets.GCP_SERVICE_ACCOUNT }}
          sql: |
            WITH something AS (
              SELECT 1 as my_number
            )
            SELECT * FROM something
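
If you don't want to wait for the next scheduled run while testing, GitHub Actions also supports manual triggering. A minimal sketch, extending the on: block of the sample above (workflow_dispatch adds a "Run workflow" button in the Actions tab):

on:
  schedule:
    - cron: '0 10 * * *' # daily at 10:00 UTC
  workflow_dispatch: # allows manual runs from the Actions tab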

That’s it. You will get a Slack message with your query results each time the workflow runs.

[Image: bq2slack]
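
For real monitoring, the sql input can hold any query the Service Account is allowed to run. Here is a sketch of a daily freshness check; the my-project.analytics.events table and its created_at column are hypothetical placeholders, so swap in your own:

        with:
          slack_webhook: ${{ secrets.SLACK_WEBHOOK }}
          gcp_service_account: ${{ secrets.GCP_SERVICE_ACCOUNT }}
          sql: |
            -- hypothetical table and column; replace with your own
            SELECT COUNT(*) AS rows_loaded_yesterday
            FROM `my-project.analytics.events`
            WHERE DATE(created_at) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)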

Hope you’ll love this simple, easy and lazy solution as much as I do. Cheers!