We are looking for Data Engineers to join the team of our client – a reference company in the financial sector.
What will be your main tasks and responsibilities?
* Building vital data pipelines and tooling to future-proof our back-office business domain;
* Developing the company's Data Platform, adding new data sources and transformations in response to new business requirements;
* Continually improving the platform, adopting new technologies and patterns as they become available;
* Building internal tooling to support the company's migration into data platforms;
* Continuing to migrate the existing platform into our Lakehouse.
What is required from you?
* Bachelor's degree in IT, Computer Science, Engineering, Data Science, or related field;
* A minimum of 2 years of experience in data engineering roles;
* Experience working with Python and SQL in a data engineering capacity;
* Experience building and deploying containerised applications on Kubernetes;
* Familiarity with Snowflake, dbt, and building DAGs with Apache Airflow;
* Experience with AWS or another public cloud, and with infrastructure-as-code tooling such as Terraform;
* Experience working in end-to-end delivery teams and with a Data Mesh strategy;
* Fluency in English (both written and spoken).
Sound like you? Send us your CV and let's talk!