About Us

Planck Technologies is a company specializing in software development, dedicated to shaping futures and creating value through innovative IT solutions.
By expanding teams and offering a comprehensive range of services—from Software Development and Infrastructure Management to Cybersecurity—we empower clients with all the expertise they need in one place.
Inspired by the principles of quantum physics, we push beyond traditional boundaries to deliver customized solutions that redefine the IT landscape and drive shared success.

About the job

- Design, build, and optimize batch and streaming data pipelines in Databricks (PySpark, Spark SQL);
- Implement scalable data transformations aligned with the Medallion Architecture (Bronze, Silver, Gold);
- Ensure data quality, reliability, and performance through testing and monitoring;
- Manage data infrastructure using Terraform and GitOps principles;
- Operate workflows with Airflow on Azure Kubernetes Service (AKS);
- Collaborate with Data Architects, Project Managers, and stakeholders to align on solutions and delivery;
- Participate in code reviews and contribute to knowledge sharing within the engineering team;
- Document workflows, processes, and deployment standards.

What are we looking for?
- Strong experience with Databricks, PySpark, and Spark SQL;
- Proven expertise in batch and streaming data processing;
- Hands-on experience with Azure Data Lake Storage Gen2 (ADLS);
- Solid knowledge of Airflow, preferably on Kubernetes (AKS);
- Understanding of Medallion Architecture principles;
- Familiarity with Terraform and infrastructure-as-code practices;
- Awareness of data privacy, governance, and security standards.

Nice to Have

- Experience with Talend and/or Fivetran;
- Knowledge of Databricks Asset Bundles;
- Familiarity with Vault, Helm charts, and Kafka monitoring tools.

Location: Remote, Portugal

We're waiting for you!