# Profiles

Deep Channel requires a `profiles.yml` file to establish a connection to the data warehouse.

Deep Channel will check for a `profiles.yml` file in four places:

- The location of the `DBT_PROFILES_DIR` environment variable
- The root of the dbt project
- Any custom path set in your Settings (Settings > dbt > Custom profiles.yml path)
- In your home directory, at `~/.dbt/profiles.yml`

If you already have a `profiles.yml`, you can skip ahead to Projects. If you need to create or configure your `profiles.yml` file, keep reading.
## Creating a profiles.yml file

If you don't have one, you can create a new one in a few minutes. It is recommended to create your `profiles.yml` file inside the `.dbt` directory of your home directory.

If you don't have a `.dbt` folder in your home directory, just create a new one:

```bash
mkdir ~/.dbt
```

Then create a `profiles.yml` file.

macOS and Linux:

```bash
touch ~/.dbt/profiles.yml
```

Windows:

```powershell
ni ~/.dbt/profiles.yml
```

Next, use a text editor to populate the `~/.dbt/profiles.yml` file based on the dbt Core profiles.yml docs.
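As a quick orientation, a `profiles.yml` maps a profile name (which must match the `profile:` setting in your `dbt_project.yml`) to one or more connection targets. A minimal skeleton, with placeholder names and values, looks roughly like this:

```yaml
# Hypothetical skeleton - "my_project" must match the `profile:`
# value in your dbt_project.yml.
my_project:
  target: dev        # the default target dbt uses when none is specified
  outputs:
    dev:
      type: [...]    # adapter name, e.g. snowflake, bigquery, postgres
      threads: 1
      # adapter-specific connection fields go here
```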
## Snowflake Profiles

Deep Channel supports authenticating with Snowflake through:

- User/Password
- Key Pair
- Single Sign On (SSO)
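The adapter-specific fields come from the dbt-snowflake profile docs; as a sketch, a user/password profile looks roughly like this (all values are placeholders):

```yaml
my_snowflake_db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: [...]    # your Snowflake account identifier
      user: [...]
      password: [...]
      role: [...]
      database: [...]
      warehouse: [...]
      schema: [...]
      threads: 1
```

For key pair authentication, the dbt-snowflake docs replace `password` with `private_key_path` (plus an optional `private_key_passphrase`).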
### SSO Authentication Caching

When using the `authenticator: externalbrowser` configuration, you can enable caching to reduce authentication prompts. Caching is available on macOS and Windows.

Consult the following documents to enable caching:

- SSO Authentication
- Using Connection Caching to Minimize the Number of Prompts for Authentication — Optional
- ALLOW_ID_TOKEN
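Putting these together, a sketch of an SSO profile (placeholder values) sets the authenticator and authenticates through the browser; connection caching itself is controlled by the `ALLOW_ID_TOKEN` account parameter referenced above:

```yaml
my_snowflake_db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: [...]
      user: [...]                      # your SSO login
      authenticator: externalbrowser   # opens a browser window for SSO
      role: [...]
      database: [...]
      warehouse: [...]
      schema: [...]
      threads: 1
```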
## BigQuery Profiles

Deep Channel supports authenticating with BigQuery through:

- OAuth - `gcloud`
- OAuth - Refresh Token
- Service Account - JSON
- Service Account - File
- OAuth - Temporary Token (partial support)

**Note:** For the optional config fields mentioned below, refer to the dbt-bigquery documentation.
### OAuth via gcloud

```yaml
bigquery_oauth_gcloud:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]
```
#### gcloud Setup

- Install `gcloud` following the official Google Cloud documentation.
- Authenticate using `gcloud`:

```bash
gcloud auth application-default login --scopes=https://www.googleapis.com/auth/bigquery
```

**Note:** dbt and Deep Channel require the `https://www.googleapis.com/auth/bigquery` scope to function; please ensure you include this scope as stated in the command above.
### OAuth - Refresh Token

```yaml
my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth-secrets
      project: [...]
      threads: 1
      refresh_token: [...]
      client_id: [...]
      client_secret: [...]
      scopes:
        - https://www.googleapis.com/auth/bigquery
      token_uri: https://www.googleapis.com/oauth2/v4/token
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]
```

**Note:** You must specify the `scopes` field as shown above to avoid authentication errors.
### OAuth - Temporary Token

```yaml
my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth-secrets
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      token: [...]
      [optional config fields]: [...]
```
#### Notes

- Deep Channel supports connecting with an OAuth (temporary token) configuration, but syncing and manual queries inside Deep Channel are not supported with it at present.
### Service Account - JSON

```yaml
my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account-json
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]
      keyfile_json:
        type: [...]
        project_id: [...]
        private_key_id: [...]
        private_key: [...]
        client_email: [...]
        client_id: [...]
        auth_uri: [...]
        token_uri: [...]
        auth_provider_x509_cert_url: [...]
        client_x509_cert_url: [...]
```
### Service Account - File

```yaml
my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      project: deepchannel-integration
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      keyfile: /path/to/keyfile.json
      [optional config fields]: [...]
```
## Databricks Profiles

```yaml
my_databricks_db:
  target: dev
  outputs:
    dev:
      type: databricks
      # optional, available when using dbt >= 1.1.1
      catalog: [...]
      schema: [...]
      host: [...]
      http_path: [...]
      token: [...]
      threads: 1
```
For additional information about setting up a Databricks connection, please see the official Databricks documentation.
## Redshift Profiles

Deep Channel supports password-based authentication for Redshift.
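As a sketch, a password-based Redshift profile looks roughly like this (placeholder values; field names follow the dbt-redshift profile docs):

```yaml
my_redshift_db:
  target: dev
  outputs:
    dev:
      type: redshift
      host: [...]    # your cluster endpoint hostname
      port: 5439     # Redshift's default port
      user: [...]
      password: [...]
      dbname: [...]
      schema: [...]
      threads: 1
```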
## PostgreSQL

Deep Channel supports password-based authentication for PostgreSQL.
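As a sketch, a password-based PostgreSQL profile looks roughly like this (placeholder values; field names follow the dbt-postgres profile docs):

```yaml
my_postgres_db:
  target: dev
  outputs:
    dev:
      type: postgres
      host: [...]
      port: 5432     # PostgreSQL's default port
      user: [...]
      password: [...]
      dbname: [...]
      schema: [...]
      threads: 1
```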