diff --git a/docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx b/docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx
index c6a8aa5749c..3e3fefcfef7 100644
--- a/docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx
+++ b/docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx
@@ -11,32 +11,21 @@ version: 1
 The recommended connector library for BigQuery is
 [pybigquery](https://github.com/mxmzdlv/pybigquery).
 
-The connection string for BigQuery looks like:
-
+### Install BigQuery Driver
+Follow the steps [here](/docs/databases/dockeradddrivers) to learn how to
+install new database drivers when setting up Superset locally via docker-compose.
 ```
-bigquery://{project_id}
+echo "pybigquery" >> ./docker/requirements-local.txt
 ```
-
-When adding a new BigQuery connection in Superset, you'll also need to add the GCP Service Account
+### Connecting to BigQuery
+When adding a new BigQuery connection in Superset, you'll need to add the GCP Service Account
 credentials file (as a JSON).
 
 1. Create your Service Account via the Google Cloud Platform control panel, provide it access to the
    appropriate BigQuery datasets, and download the JSON configuration file for the service account.
-
-2. n Superset, Add a JSON blob to the **Secure Extra** field in the database configuration form with
-   the following format:
-
+2. In Superset, you can either upload that JSON file or add a JSON blob in the following format (this should be the contents of your credentials JSON file):
 ```
 {
-    "credentials_info":
-}
-```
-
-The resulting file should have this structure:
-
-```
-{
-    "credentials_info": {
         "type": "service_account",
         "project_id": "...",
         "private_key_id": "...",
@@ -48,10 +37,49 @@ The resulting file should have this structure:
         "auth_provider_x509_cert_url": "...",
         "client_x509_cert_url": "..."
     }
-}
-```
+    ```
+
+![CleanShot 2021-10-22 at 04 18 11](https://user-images.githubusercontent.com/52086618/138352958-a18ef9cb-8880-4ef1-88c1-452a9f1b8105.gif)
+
+
+3. Additionally, you can connect via a SQLAlchemy URI instead:
+
+   The connection string for BigQuery looks like:
+
+   ```
+   bigquery://{project_id}
+   ```
+   Go to the **Advanced** tab and add a JSON blob to the **Secure Extra** field in the database configuration form with
+   the following format:
+   ```
+   {
+       "credentials_info":
+   }
+   ```
+
+   The resulting file should have this structure:
+   ```
+   {
+       "credentials_info": {
+           "type": "service_account",
+           "project_id": "...",
+           "private_key_id": "...",
+           "private_key": "...",
+           "client_email": "...",
+           "client_id": "...",
+           "auth_uri": "...",
+           "token_uri": "...",
+           "auth_provider_x509_cert_url": "...",
+           "client_x509_cert_url": "..."
+       }
+   }
+   ```
 
 You should then be able to connect to your BigQuery datasets.
 
+![CleanShot 2021-10-22 at 04 47 08](https://user-images.githubusercontent.com/52086618/138354340-df57f477-d3e5-42d4-b032-d901c69d2213.gif)
+
+
+
 To be able to upload CSV or Excel files to BigQuery in Superset, you'll need to also add the
 [pandas_gbq](https://github.com/pydata/pandas-gbq) library.
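
A note on the **Secure Extra** format this diff documents: it is easy to get wrong by pasting a truncated service-account file, or by forgetting the `credentials_info` envelope. Below is a minimal stdlib-only Python sketch of that check; the helper names `validate_credentials` and `to_secure_extra` are illustrative and not part of Superset or pybigquery.

```python
import json

# Keys a GCP service-account credentials file is expected to contain,
# matching the fields shown in the docs above.
REQUIRED_KEYS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "auth_uri", "token_uri",
    "auth_provider_x509_cert_url", "client_x509_cert_url",
}

def validate_credentials(blob: str) -> list:
    """Return a sorted list of required keys missing from a credentials JSON blob."""
    info = json.loads(blob)
    return sorted(REQUIRED_KEYS - info.keys())

def to_secure_extra(blob: str) -> str:
    """Wrap raw credentials JSON in the `credentials_info` envelope that the
    Secure Extra field expects (hypothetical helper, shown for illustration)."""
    return json.dumps({"credentials_info": json.loads(blob)}, indent=4)
```

For example, `to_secure_extra(open("service-account.json").read())` produces a blob of the same shape as the `credentials_info` example in the docs above, ready to paste into the **Secure Extra** field.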