Job Summary
We are seeking a highly skilled Data Architect to design, build, and maintain scalable data pipelines and ETL processes on AWS.
This is an excellent opportunity for an experienced data engineering professional to join our team of talented data experts.
* Dream big. Innovate fast. Deliver value to customers through data-driven insights.
-----------------------------------
Key Responsibilities
* Design, build, and maintain scalable data pipelines and ETL processes on AWS.
* Integrate, clean, and transform large datasets from multiple sources.
* Optimize data storage and performance using AWS services such as S3, Redshift, and Glue.
* Collaborate with data scientists and analysts to deliver reliable, high-quality data.
-----------------------------------
Requirements
* Proficiency in Python and SQL for data processing and automation.
* Strong knowledge of AWS data services (S3, Glue, Redshift, Lambda, EMR).
* Experience with ETL orchestration tools (e.g., Airflow, AWS Step Functions).
* Understanding of data modeling, warehousing, and performance optimization.