Data Engineer
Are you passionate about building scalable, reliable data pipelines? Join a fast-growing European tech scale-up that is reshaping the way financial institutions manage and analyse complex data flows. We are looking for a talented Data Engineer to help architect and deliver data infrastructure that powers critical business insights and compliance workflows.
What You’ll Do:
* Design, build, and optimise robust ETL/ELT pipelines handling large volumes of structured and unstructured data
* Work closely with cross-functional teams, including product, backend, and cloud infrastructure, to deploy new data features and integrations
* Ensure high data quality, integrity, and security in line with regulatory requirements
* Monitor and improve data system performance, scalability, and cost-efficiency
* Influence data architecture decisions and best practices across the engineering team
Your Skills and Experience:
* 2+ years' experience as a Data Engineer or in a similar role, ideally in a fast-paced environment
* Strong programming skills in Python, experience with data processing frameworks such as Spark or Dask, and familiarity with workflow orchestration tools such as Apache Airflow
* Hands-on experience with streaming technologies like Kafka or Flink is highly desirable
* Proficiency in SQL and experience with cloud data warehouses (e.g. BigQuery, Snowflake, Redshift)
* Familiarity with cloud platforms (AWS preferred) and infrastructure-as-code tools such as Terraform
* Excellent analytical and problem-solving skills, with attention to detail and a proactive mindset
Bonus Points For:
* Experience working with financial data, regulatory compliance, or risk management
* Knowledge of data privacy regulations (e.g., GDPR, MiFID II)
* Containerisation and orchestration experience (Docker, Kubernetes)