Job Title: Data Engineer
We are seeking a skilled Data Engineer to design and build data pipelines and analytics platforms on Amazon Web Services (AWS) infrastructure.
Main Responsibilities:
* Designing and building scalable data pipelines using AWS S3, Glue, Lambda, and Kinesis.
* Data modeling and architecture in Snowflake, ensuring efficient data processing and storage.
* Performance tuning and cost optimization in AWS and Snowflake, minimizing waste and maximizing ROI.
* Automation with Apache Airflow, AWS Step Functions, or Snowflake Tasks, streamlining workflows and reducing manual errors.
* Data governance and security using Snowflake RBAC or AWS IAM, protecting sensitive information and maintaining compliance.
* Collaboration with cross-functional teams, including data scientists, analysts, and engineers, to drive business outcomes.
* Developing data quality standards, procedures, and reporting, ensuring high-quality data and actionable insights.
Requirements:
* 5-7 years of experience in data engineering, including large-scale projects and complex data architectures.
* Experience with AWS Step Functions for workflow automation.
* Snowflake expertise, with proficiency in data modeling, performance tuning, and cost optimization.
* Strong Python skills, with experience in scripting and automation.
* SQL expertise, with knowledge of database management and querying.
* Experience with Apache Airflow for workflow automation and scheduling.
* Knowledge of Snowflake Tasks for task automation and orchestration.
* Snowflake and AWS certifications preferred.
* Excellent communication and collaboration skills, with the ability to work effectively in a team environment.