- I decided to go with Mage as the orchestration tool because it is incredibly easy to use and lends itself to good modularization of the execution plan.
- For the scripting I used Python, because it was the required programming language and it integrates well with orchestration tools such as Airflow, Mage, and Dagster.
- For storage I use TimescaleDB, since this project works with time series data pulled from the Alpha Vantage API. This type of database partitions and indexes records by a timestamp column by default, so no extra indexing or clustering was necessary.
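As a sketch of why no manual indexing was needed: TimescaleDB turns a plain Postgres table into a time-partitioned hypertable with `create_hypertable`. The table and column names below (`prices`, `ts`, `symbol`) are illustrative assumptions, not taken from the project:

```python
# Sketch of the DDL a TimescaleDB setup step might run.
# Table/column names (prices, ts, symbol) are illustrative assumptions.
def hypertable_ddl(table: str = "prices", time_col: str = "ts") -> list:
    """Return the SQL statements that create a TimescaleDB hypertable."""
    return [
        f"""CREATE TABLE IF NOT EXISTS {table} (
            {time_col} TIMESTAMPTZ NOT NULL,
            symbol TEXT NOT NULL,
            close DOUBLE PRECISION
        );""",
        # create_hypertable partitions and indexes the table by the time
        # column, which is why no manual indexing/clustering was needed.
        f"SELECT create_hypertable('{table}', '{time_col}', if_not_exists => TRUE);",
    ]

for stmt in hypertable_ddl():
    print(stmt)
```

Executing these statements against the database (e.g. from a Mage block or DBeaver) is all the physical design the project needs.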
- I chose DBeaver to interact with the database.
Grab an API key from Alpha Vantage:
- https://www.alphavantage.co/support/#api-key
- Save it for later; it will be registered as a Mage secret
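To illustrate how the key is used, a pull block might assemble a request URL like the one below. The `TIME_SERIES_DAILY` function and `IBM` symbol are just example choices from the public Alpha Vantage query API; the actual fetch (e.g. with `requests.get`) is left out so the sketch stays offline:

```python
from urllib.parse import urlencode

# Sketch: build an Alpha Vantage request URL for a daily time series pull.
# Fetching and parsing the JSON response is omitted on purpose.
BASE_URL = "https://www.alphavantage.co/query"

def build_query_url(api_key: str, symbol: str = "IBM",
                    function: str = "TIME_SERIES_DAILY") -> str:
    """Assemble the query-string URL for a time series request."""
    params = {"function": function, "symbol": symbol, "apikey": api_key}
    return f"{BASE_URL}?{urlencode(params)}"

print(build_query_url("demo"))
```

In the real pipeline the `api_key` argument would come from the Mage secret you register below, never from a hard-coded string.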
- For this project you will need the following variables:
  - POSTGRES_SCHEMA
  - POSTGRES_USER
  - POSTGRES_PASSWORD
  - POSTGRES_HOST
  - PG_HOST_PORT
  - POSTGRES_DB
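A loader block would typically read these variables and assemble a connection URL. In Mage they would come from the secrets store rather than the environment; this sketch uses a plain dict, and the values shown are placeholders, not project credentials:

```python
import os

# Sketch: assemble a Postgres/TimescaleDB connection URL from the
# variables listed above. In Mage these would be read from secrets
# instead of os.environ; all values here are placeholders.
def connection_url(env=None) -> str:
    env = env if env is not None else os.environ
    return "postgresql://{user}:{pw}@{host}:{port}/{db}".format(
        user=env["POSTGRES_USER"],
        pw=env["POSTGRES_PASSWORD"],
        host=env["POSTGRES_HOST"],
        port=env["PG_HOST_PORT"],
        db=env["POSTGRES_DB"],
    )

print(connection_url({
    "POSTGRES_USER": "mage", "POSTGRES_PASSWORD": "secret",
    "POSTGRES_HOST": "localhost", "PG_HOST_PORT": "5432",
    "POSTGRES_DB": "timeseries",
}))
```

POSTGRES_SCHEMA is not part of the URL; it would be applied separately when qualifying table names in the insert step.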
- Open a terminal
- Change directory to the path where you cloned the repo
- Run Docker Compose:
docker compose up
- After Docker Compose finishes, go to:
http://localhost:6789/pipelines
To set up the secrets:
- 1 Go to Edit pipeline
- 2 Go to Secrets
- 3 Start adding them
- See the Mage secrets documentation for details
- The secrets needed are the Alpha Vantage API key and the Postgres variables listed above
- Go to the directory containing the docker-compose.yml
- There should be two new directories:
  - mage_data
  - magic
- Copy all the files inside financialtimeseries into the magic folder (recursively, in case it contains subdirectories):
cp -r ./financialtimeseries/* ./magic/
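The same copy step can be done from Python, which is handy on systems where `cp` is unavailable. This is just a convenience sketch mirroring the command above, not part of the project code:

```python
import shutil
from pathlib import Path

# Sketch: copy everything from ./financialtimeseries into ./magic,
# mirroring the cp command above but working on any OS.
def copy_pipeline_files(src: str = "financialtimeseries",
                        dst: str = "magic") -> list:
    """Copy src into dst (merging with existing content) and
    return the names of the files now present under dst."""
    shutil.copytree(Path(src), Path(dst), dirs_exist_ok=True)
    return sorted(p.name for p in Path(dst).rglob("*") if p.is_file())
```

`dirs_exist_ok=True` lets the copy merge into the `magic` directory that Docker Compose already created, instead of failing because it exists.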
- You should now see the pipeline in the browser
- Click on the pipeline name; it will navigate to the Triggers section
- Hit Run@once and, in the pop-up, Run now; after a while you should see the following blocks complete:
- 1 PullData
- 2 PrepareData
- 3 InsertToDB
- 4 Analytics
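Conceptually the four blocks form a linear chain. A stripped-down, in-memory sketch of that flow follows; the function names mirror the block names, while the toy prices and the average-price metric are invented for illustration:

```python
# Toy end-to-end sketch of the four pipeline blocks. The real blocks hit
# the Alpha Vantage API and TimescaleDB; here each stage is pure Python.
def pull_data() -> dict:
    # Stand-in for the API response: date -> closing price (as strings).
    return {"2024-01-02": "185.64", "2024-01-03": "184.25"}

def prepare_data(raw: dict) -> list:
    # Cast prices to float and sort rows by date.
    return sorted((d, float(p)) for d, p in raw.items())

def insert_to_db(rows, db: list) -> int:
    # Stand-in for the TimescaleDB insert: append to an in-memory "table".
    db.extend(rows)
    return len(rows)

def analytics(db) -> float:
    # Example metric: average closing price over the stored series.
    return sum(price for _, price in db) / len(db)

table = []
inserted = insert_to_db(prepare_data(pull_data()), table)
print(inserted, analytics(table))
```

Each stage consumes only the previous stage's output, which is what lets Mage run them as separate, restartable blocks.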





