Profiles

Deep Channel requires a profiles.yml file to establish a connection to the data warehouse.

Deep Channel will check for a profiles.yml file in four places:

  1. The directory set in the DBT_PROFILES_DIR environment variable (see the example below)
  2. The root of the dbt project
  3. Any custom path set in your Settings (Settings > dbt > Custom profiles.yml path)
  4. Your home directory, at ~/.dbt/profiles.yml
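
For example, to point dbt and Deep Channel at a custom profiles directory, you can export the environment variable before launching (the path here is illustrative):

export DBT_PROFILES_DIR=~/my-dbt-profiles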

If you already have a profiles.yml, you can skip ahead to Projects. If you need to create or configure your profiles.yml file, keep reading.

Creating a profiles.yml file

If you don’t have one, you can create a new one in a few minutes. We recommend creating your profiles.yml file inside the .dbt directory of your home directory.

If you don’t have a .dbt folder in your home directory, create one:

mkdir ~/.dbt

And then create a profiles.yml file:

macOS and Linux:

touch ~/.dbt/profiles.yml

Windows (PowerShell):

ni ~/.dbt/profiles.yml

Next, use a text editor to populate the ~/.dbt/profiles.yml file based on the dbt Core profiles.yml docs.
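
For orientation, every profiles.yml follows the same basic shape regardless of warehouse; a minimal skeleton (names and values here are placeholders to adapt):

my_profile:            # must match the `profile` name in dbt_project.yml
  target: dev          # the default target to use
  outputs:
    dev:
      type: [...]      # adapter name, e.g. snowflake, bigquery, databricks
      threads: 1
      # adapter-specific connection fields go here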

Snowflake Profiles

Deep Channel supports authenticating with Snowflake through:

  • User/Password
  • Key Pair
  • Single Sign On (SSO)
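
For reference, a minimal User/Password profile might look like the sketch below, based on the dbt-snowflake profile fields (all values are placeholders):

my_snowflake_db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: [...]     # e.g. abc12345.us-east-1
      user: [...]
      password: [...]
      role: [...]
      database: [...]
      warehouse: [...]
      schema: [...]
      threads: 1

For Key Pair authentication, replace password with the dbt-snowflake private_key_path field (and private_key_passphrase if your key is encrypted).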

SSO Authentication Caching

When using the authenticator: externalbrowser configuration, you can enable caching to reduce authentication prompts.
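
A minimal SSO profile sketch, again with placeholder values, swaps the password for the authenticator field:

my_snowflake_db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: [...]
      user: [...]
      authenticator: externalbrowser
      database: [...]
      warehouse: [...]
      schema: [...]
      threads: 1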

Caching is available on macOS and Windows.

To enable caching, consult the Snowflake documentation on connection caching for browser-based SSO.
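
For context, caching is typically enabled on the Snowflake account itself by an administrator; a sketch of the documented account parameter (confirm against the current Snowflake docs before relying on it):

-- Run as an account administrator in Snowflake
ALTER ACCOUNT SET ALLOW_ID_TOKEN = TRUE;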

BigQuery Profiles

Deep Channel supports authenticating with BigQuery through:

  • OAuth via gcloud
  • OAuth - Refresh token
  • OAuth - Temporary token
  • Service Account - JSON
  • Service Account - file

note

For the optional config fields mentioned below, refer to the dbt-bigquery documentation.

OAuth via gcloud

bigquery_oauth_gcloud:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]

gcloud Setup

  1. Install gcloud following the official Google Cloud documentation.
  2. Authenticate using gcloud:
    gcloud auth application-default login --scopes=https://www.googleapis.com/auth/bigquery
note

dbt and Deep Channel require the https://www.googleapis.com/auth/bigquery scope to function; please ensure you include this scope as shown in the command above.

OAuth - Refresh token

my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth-secrets
      project: [...]
      threads: 1
      refresh_token: [...]
      client_id: [...]
      client_secret: [...]
      scopes:
        - https://www.googleapis.com/auth/bigquery
      token_uri: https://www.googleapis.com/oauth2/v4/token
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]
note

You must specify the scopes field as shown above to avoid authentication errors.

OAuth - Temporary token

my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth-secrets
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      token: [...]
      [optional config fields]: [...]

Notes

  • Deep Channel supports full functionality with an OAuth (temporary token) connection, except for syncing and manual queries inside Deep Channel, which are not currently supported.

Service Account - JSON

my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account-json
      project: [...]
      threads: 1
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      [optional config fields]: [...]
      keyfile_json:
        type: [...]
        project_id: [...]
        private_key_id: [...]
        private_key: [...]
        client_email: [...]
        client_id: [...]
        auth_uri: [...]
        token_uri: [...]
        auth_provider_x509_cert_url: [...]
        client_x509_cert_url: [...]

Service Account - file

my_bigquery_db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      project: deepchannel-integration
      # one of `dataset` or `schema` must be specified
      dataset: [...]
      schema: [...]
      keyfile: /path/to/keyfile.json
      [optional config fields]: [...]

Databricks Profiles

my_databricks_db:
  target: dev
  outputs:
    dev:
      type: databricks
      # optional, available when using dbt >= 1.1.1
      catalog: [...]
      schema: [...]
      host: [...]
      http_path: [...]
      token: [...]
      threads: 1

For additional information about setting up a Databricks connection, please see the official Databricks documentation.

Redshift Profiles

Deep Channel supports password-based authentication for Redshift.
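
A minimal password-based Redshift profile sketch, following the dbt Core Redshift profile fields (all values are placeholders):

my_redshift_db:
  target: dev
  outputs:
    dev:
      type: redshift
      host: [...]      # e.g. my-cluster.abc123.us-east-1.redshift.amazonaws.com
      user: [...]
      password: [...]
      port: 5439
      dbname: [...]
      schema: [...]
      threads: 1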

PostgreSQL Profiles

Deep Channel supports password-based authentication for PostgreSQL.
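
A minimal password-based PostgreSQL profile sketch, following the dbt Core PostgreSQL profile fields (all values are placeholders):

my_postgres_db:
  target: dev
  outputs:
    dev:
      type: postgres
      host: [...]
      user: [...]
      pass: [...]      # note: dbt-postgres names the password field `pass`
      port: 5432
      dbname: [...]
      schema: [...]
      threads: 1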