About the Role
As a key member of our Activity Analytics Platform team, you will work closely with business experts to build streaming and batch pipelines, consolidating event data and applying identification algorithms. In this role, you will be:
* Planning projects with senior data engineering team members.
* Creating single sources of truth for activity data.
* Implementing technical and business data validation rules.
* Exposing Data Products via APIs and dimensional models to accelerate analyses.
* Improving best practices for a reliable delivery process.
* Assisting with the creation and upkeep of MLOps pipelines.
We're looking for someone with:
* Experience in data engineering or a similar role.
* Fluency in English.
* Strong SQL skills, including T-SQL and Snowflake.
* Experience with Pentaho Data Integration (Kettle) or Apache Hop.
* Knowledge of modern data architectures and streaming or real-time pipelines, using technologies such as Kafka, Kinesis, or Snowflake Dynamic Tables.
* Good communication skills and experience in gathering and analyzing data requirements.
Responsibilities
1. Design and develop efficient data pipelines.
2. Collaborate with cross-functional teams to drive digital transformation.
3. Develop and maintain high-quality code.
4. Optimize data processing for faster insights.
5. Identify opportunities to improve data pipelines and delivery processes.
We offer a dynamic work environment and opportunities for growth and development. If you're passionate about data engineering and want to be part of a forward-thinking team, please apply.