# Lolafect
Lolafect is a collection of Python bits that help us build our Prefect flows.
## Quickstart
Below are examples of how to leverage `lolafect` in your flows.
**Let the `LolaConfig` object do the boilerplate env stuff for you**
```python
from lolafect.lolaconfig import build_lolaconfig

lolaconfig = build_lolaconfig(flow_name="some-flow")

# Now you can access all the env stuff from here
lolaconfig.FLOW_NAME
lolaconfig.FLOW_NAME_UDCS
lolaconfig.STORAGE
lolaconfig.KUBERNETES_IMAGE
lolaconfig.KUBERNETES_LABELS
lolaconfig.SLACK_WEBHOOKS
lolaconfig.DW_CREDENTIALS
lolaconfig.TRINO_CREDENTIALS
lolaconfig.SSH_TUNNEL_CREDENTIALS
lolaconfig.PREFECT_HOST
# etc

# Is your flow different from the typical one?
# You can customize the behaviour of LolaConfig
lolaconfig = build_lolaconfig(
    flow_name="some-flow",
    env_s3_bucket="my-odd-bucket",
    kubernetes_labels=["some-unusual-label"],
    kubernetes_image="the-image:not-the-production-one",
)
```
**Connect to a Trino server**
```python
from prefect import Flow

from lolafect.connections import connect_to_trino, close_trino_connection

with Flow(...) as flow:
    connection = connect_to_trino.run(
        trino_credentials=my_trino_credentials,  # You can probably fetch this from lolaconfig.TRINO_CREDENTIALS
    )
    task_result = some_trino_related_task(trino_connection=connection)
    close_trino_connection.run(
        trino_connection=connection,
        upstream_tasks=[task_result],
    )
```
**Send a warning message to Slack if your task fails**
```python
from prefect import Flow
from prefect.triggers import any_failed

from lolafect.slack import SendSlackMessageTask

# You can generate other tasks with different triggers. For example, you can
# send a message when all tasks fail, or when all tasks succeed.
send_warning_message_on_any_failure = SendSlackMessageTask(trigger=any_failed)

with Flow(...) as flow:
    crucial_task_result = some_crucial_task()
    send_warning_message_on_any_failure(
        webhook_url="the-channel-webhook",  # You can probably fetch this from lolaconfig.SLACK_WEBHOOKS
        text_to_send="Watch out, the flow failed!",
        upstream_tasks=[crucial_task_result],
    )
```
## How to test
There are two test suites: unit tests and integration tests. The integration tests connect to some of our AWS
resources, so they require specific credentials and permissions and are therefore not fully reliable. The
recommended policy is:

- Use the unit tests in any CI process you want.
- Run the unit tests frequently as you code.
- Do not use the integration tests in CI processes.
- Use the integration tests as milestone checks when finishing feature branches.
- Ensure the integration tests pass before making a new release.

When building new tests, please keep this philosophy in mind.
IDE-agnostic:
1. Set up a virtual environment which contains both `lolafect` and the dependencies listed in `requirements-dev.txt`.
2. Run:
   - For all tests: `pytest tests`
   - Only unit tests: `pytest tests/test_unit`
   - Only integration tests: `pytest tests/test_integration`
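
Concretely, those two steps might look like this (a sketch assuming a POSIX shell, Python 3 on the PATH, and a local clone of the repository; the requirements file name comes from this README, the `.venv` path is illustrative):

```shell
# Create and activate an isolated environment
python3 -m venv .venv
. .venv/bin/activate

# Install lolafect plus the dev dependencies
python -m pip install -e . -r requirements-dev.txt

# Run the suites
pytest tests                    # all tests
pytest tests/test_unit          # unit tests only
pytest tests/test_integration   # integration tests only
```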
In PyCharm:
- If you configure `pytest` as the project test runner, PyCharm will most probably autodetect the test
folder and let you run the test suite within the IDE. However, PyCharm has trouble running the integration
tests, since the shell it runs from does not have the AWS credentials. For now, we therefore recommend using
the integrated PyCharm test runner only for the unit tests. You can easily set up a Run Configuration for that.