We are looking for a highly skilled Tech Lead to spearhead the migration of data pipelines from Snowflake to Databricks.
The ideal candidate has extensive experience in data engineering, ETL orchestration, and database management, with strong proficiency in programming and distributed computing.

Key Responsibilities:
- Lead the migration of data pipelines from Snowflake to Databricks.
- Design, develop, and optimize ETL workflows and data pipelines.
- Collaborate with cross-functional teams to understand database requirements and ensure a successful migration.
- Implement data engineering best practices and ensure high performance and reliability of data systems.
- Identify opportunities to optimize and reduce costs associated with data storage and processing.
- Lead and mentor a team of engineers, fostering a collaborative and productive work environment.
- Apply Scrum methodologies to manage project workflows and deliverables efficiently.

Skills:
- Very good level of English.
- 3+ years of professional experience in data engineering, business intelligence, or a similar role.
- Proficiency in programming languages such as Python.
- 5+ years of experience with ETL orchestration and workflow management tools such as Airflow, Flink, Oozie, and Azkaban on AWS or GCP.
- Expertise in database fundamentals, Tableau or SQL, and distributed computing.
- 3+ years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto).
- Experience working with Snowflake, Redshift, PostgreSQL, Tableau, and/or other DBMS platforms.

Ready to take the lead?
Apply now and be part of our innovative team!