As a data architect, you will design and develop scalable data platforms to process and analyze large datasets.
* Implement data processing workflows using frameworks such as Apache Spark or Apache Beam.
* Collaborate with cross-functional teams to ensure data accuracy, consistency, and availability across the enterprise.
* Develop robust access controls and encryption to safeguard data security and privacy.
To succeed in this role, you must have a solid understanding of distributed systems, modern data architectures, and data modeling principles. You should also be proficient in a programming language such as Python, Java, or Scala, and have experience with big data tools and ETL/integration platforms. Expertise in cloud platforms such as AWS or Azure is highly desirable.
Key Responsibilities:
* Design and implement scalable data pipelines using Spark, Beam, or other big data processing engines (see the illustrative sketch after this list).
* Develop and maintain high-performance data processing workflows that meet business requirements.
* Collaborate with data scientists and analysts to integrate data into analytics platforms and applications.
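As a rough illustration of the kind of pipeline work this role involves, here is a minimal batch aggregation sketch in PySpark. The dataset path, schema, and column names (event_timestamp, user_id) are assumptions made purely for this example, not details of our platform.

```python
# Illustrative only: assumes PySpark is available and a raw "events" dataset
# exists in Parquet form at the (hypothetical) path below.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

# Read raw events (path and schema are assumptions for this sketch).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Aggregate events per user per day -- a typical batch transformation step.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write the result partitioned by date for downstream analytics consumers.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)

spark.stop()
```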
Requirements:
* Strong understanding of distributed systems, data architecture, and data modeling principles.
* Proficiency in programming languages such as Python, Java, or Scala.
* Experience working with big data tools and ETL/integration platforms.
What We Offer:
* Competitive salary and benefits package.
* A dynamic and collaborative work environment.
* The opportunity to work with cutting-edge technologies on innovative projects.
* Collaboration with experienced data professionals and the chance to contribute to the organization's growth.
* Career growth and professional development opportunities, including training and certification programs.
We are an equal opportunities employer and welcome applications from candidates of all backgrounds.