many things

This commit is contained in:
Pablo Martin 2024-01-18 17:25:41 +01:00
parent 9d70ee45ec
commit 426f0fbb0d
10 changed files with 103 additions and 21 deletions

.vscode/settings.json vendored

@@ -1,5 +1,13 @@
 {
-    "sql-formatter.uppercase": true,
     "dbt.queryLimit": 500,
-    "dbt.enableNewQueryPanel": true
+    "dbt.enableNewQueryPanel": true,
+    "emeraldwalk.runonsave": {
+        "commands": [
+            {
+                "match": ".*\\.sql(\\.jinja)?",
+                "isAsync": true,
+                "cmd": "sqlfmt ${file}"
+            }
+        ]
+    }
 }


@@ -1,16 +1,67 @@
# DWH dbt

Welcome to Superhog's DWH dbt project. Here we model the entire DWH.

## How to set up your environment

- Pre-requisites
  - You need a Linux environment: Linux, macOS, or WSL.
  - You need Python `>=3.10` and `poetry` installed.
  - All docs assume you are using VSCode.
- Prepare SSH tunnels
  - We currently use SSH tunnels to reach both the `dev` and `prd` instances. Ask Pablo how to set these up.
  - You will need the tunnels active in order to run the dbt models against the databases. It will probably pay off to make them easy to activate in your terminal, e.g. with an alias.
- Set up
  - Create an entry for this project in your `profiles.yml` file at `~/.dbt/profiles.yml`. A suggested template is at `profiles.yml.example`.
  - Make sure the `profiles.yml` host and port settings are consistent with the tunnels.
  - Run `poetry install` to get dependencies in place.
- Check
  - Ensure you are running in the project venv, either by setting the VSCode Python interpreter to the one created by `poetry`, or by running `poetry shell` from the root dir.
  - Turn on your tunnel to `dev` and run `dbt debug`. If it succeeds, you are all set. If it fails, something is wrong with your setup: grab the terminal output and pull the thread.
- Complements
  - If you are in VSCode, you most probably want this extension installed: [dbt Power User](https://marketplace.visualstudio.com/items?itemName=innoverio.vscode-dbt-power-user)
  - We advise using [this autoformatter](https://sqlfmt.com/) and automatically [running it on save](https://docs.sqlfmt.com/integrations/vs-code).
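The setup steps above suggest making the tunnels easy to activate with an alias. A minimal sketch, where the bastion hosts, user, and ports are placeholders rather than the real infrastructure values (ask Pablo for those):

```shell
# Hypothetical tunnel aliases; hostnames, user, and ports are placeholders.
# -N opens the tunnel without running a remote command;
# -L forwards a local port to the database behind the bastion.
alias tunnel-dev='ssh -N -L 5433:localhost:5432 youruser@dev-bastion.example.com'
alias tunnel-prd='ssh -N -L 5434:localhost:5432 youruser@prd-bastion.example.com'
```

The forwarded local ports (here 5433/5434) are what go into the `port` fields of your `profiles.yml`.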
## Branching strategy

This repo follows a trunk-based-development philosophy (<https://trunkbaseddevelopment.com/>).

Open a feature branch (`feature/your-branch-name`) for any change and keep it short-lived. It's fine and encouraged to build incrementally towards a `mart`-level table across multiple PRs, as long as you keep the model buildable along the way.

## Project organization
We organize models in four folders:
- `sync`
- Dedicated to sources.
- One `.yml` per `sync` schema.
- No SQL models go here.
- `staging`
- Pretty much this: <https://docs.getdbt.com/best-practices/how-we-structure/2-staging>
- All models go prefixed with `stg_`.
- Avoid `SELECT *`. We don't know what dirty stuff can come from the `sync` schemas.
- `intermediate`
- Pretty much this: <https://docs.getdbt.com/best-practices/how-we-structure/3-intermediate>
  - It's strictly forbidden to expose tables in this layer to end users.
- Make an effort to practice DRY.
- `reporting`
- Pretty much this: <https://docs.getdbt.com/best-practices/how-we-structure/4-marts>
  - For now, we follow a monolithic approach and just have one `reporting` schema. When this becomes insufficient, we will consider splitting it into several schemas.
  - Make an effort to keep this layer stable, as you would a library's API, so that downstream dependencies don't break unexpectedly.
## Conventions
- Always use CTEs in your models to `source` and `ref` other models.
- We follow [snake case](https://en.wikipedia.org/wiki/Snake_case).
- Identifier columns should begin with `id_`, not finish with `_id`.
- Use question-like names for binary, bool, and flag columns (e.g. not `active` but `is_active`, not `verified` but `has_been_verified`, not `imported` but `was_imported`).
- Datetime columns should end in either `_utc` or `_local`. If they end in `_local`, the table should contain a `local_timezone` column holding the [timezone identifier](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones).
- We work with many currencies and lack a single main one, so any money field is ambiguous on its own. To address this, any table with money-related columns should also have a column named `currency`. We currently have no policy for tables where a single record has columns in different currencies. If you face this, assemble the data team and decide on something.
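Put together, a model following these conventions might look like this. The model, source, and column names here are hypothetical, purely for illustration:

```sql
-- Hypothetical staging model, e.g. stg_core_booking.sql.
-- Columns are listed explicitly: no SELECT * against sync schemas.
with booking as (
    select id, deal_id, cancelled, created_at, total_amount, currency
    from {{ source("core", "booking") }}
)
select
    id as id_booking,              -- identifiers start with id_, not end with _id
    deal_id as id_deal,
    cancelled as is_cancelled,     -- booleans read as yes/no questions
    created_at as created_at_utc,  -- datetimes end in _utc or _local
    total_amount,                  -- money columns always travel with...
    currency                       -- ...a currency column
from booking
```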
## Stuff that we haven't done but we would like to
- Automate formatting with git pre-commit.
- Define conventions on testing (and enforce them).
- Define conventions on documentation (and enforce them).
- Replace SSH tunneling with a Wireguard VPN access.
- Prepare a quick way to replicate parts of the `prd` dwh in our local machines.
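For the first wishlist item, a pre-commit setup could look roughly like this. The repo URL, `rev`, and hook id are assumptions to verify against the sqlfmt docs before adopting:

```yaml
# Sketch of a .pre-commit-config.yaml; check repo/rev/hook id
# against the sqlfmt documentation.
repos:
  - repo: https://github.com/tconbeer/sqlfmt
    rev: v0.21.0  # placeholder version
    hooks:
      - id: sqlfmt
```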


@@ -33,7 +33,7 @@ clean-targets: # directories to be removed by `dbt clean`
 models:
   dwh_dbt:
     staging:
-      +materialized: table
+      +materialized: view
       +schema: staging
     working:
       +materialized: view


@@ -0,0 +1,4 @@
with int_core_unified_user as (select * from {{ ref("int_core_unified_user") }})
select id_deal, count(1) as users_with_this_id_deal
from int_core_unified_user
group by id_deal


@@ -0,0 +1,3 @@
with int_core_deal_id_master_list as (select * from {{ ref("int_core_deal_id_master_list") }})
select *
from int_core_deal_id_master_list


@@ -1,3 +0,0 @@
with wkg_core_deal_id_master_list as (select * from {{ ref("wkg_core_deal_id_master_list") }})
select *
from wkg_core_deal_id_master_list


@@ -1,4 +0,0 @@
with wkg_core_unified_user as (select * from {{ ref("wkg_core_unified_user") }})
select id_deal, count(1) as users_with_this_id_deal
from wkg_core_unified_user
group by id_deal

profiles.yml.example Normal file

@@ -0,0 +1,23 @@
dwh_dbt:
  outputs:
    dev:
      dbname: dwh
      host: localhost
      user: <your-user>
      pass: <your-password>
      port: <the local port for your ssh tunnel>
      schema: working
      threads: 4
      type: postgres
    prd:
      dbname: dwh
      host: localhost
      user: <prd-dbt-user>
      pass: <prd-dbt-password>
      port: <the local port for your ssh tunnel>
      schema: working
      threads: 4
      type: postgres
  target: dev
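Once this file is in place and a tunnel is up, the two outputs map directly to dbt's `--target` flag:

```shell
# Check connectivity against the default target (dev)
dbt debug

# Build against prd instead (requires the prd tunnel to be up)
dbt run --target prd
```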