Job Information
Date Opened: 20/01/2026
Job Type: Full-time
Remote Job: Yes
Industry: IT Services
Job Description
This is a remote position.
We are looking for a Senior Databricks Specialist + Python to design, evolve, and govern a common reusable Python library that serves as a core foundation for batch and streaming pipelines across the Medallion Architecture (Bronze, Silver, Gold).
Communication with the team is in Portuguese and English.
Requirements
* Minimum 10 years of professional experience developing in Python.
* At least 5 years of hands-on experience with Databricks, including PySpark development in production environments.
* Proven experience working on Scrum or other Agile teams.
* Solid experience designing Python libraries, frameworks, or shared components.
* Strong knowledge of software engineering best practices, including:
  * Object-Oriented Programming (OOP)
  * Design patterns
  * Unit and integration testing
  * CI/CD pipelines
* Experience with code standardization and quality tools, such as linters and formatters (e.g., pylint, flake8, black).
* Strong understanding of batch and streaming data processing.
* Experience with Medallion Architecture and data lifecycle best practices.
* Familiarity with Airflow, Terraform, and Azure ADLS Gen2.
* Professional working proficiency in English, both written and spoken.
Key Responsibilities
* Design, implement, and maintain a shared Python library for Databricks, supporting batch and streaming pipelines.
* Develop reusable PySpark modules, base classes, and abstractions for Bronze, Silver, and Gold layers.
* Actively participate as a Scrum team member in Sprint Planning, Daily Stand-up, Refinement, Review, and Retrospective ceremonies.
* Define and enforce software engineering best practices, including coding standards, documentation, testing strategies, and versioning.
* Establish and maintain code quality standards, including linting, formatting, and static analysis.
* Collaborate with Product Owners and fellow engineers to clarify requirements and deliver incremental value.
* Maintain and improve CI/CD pipelines using GitLab and Databricks Asset Bundles (DABs).
* Ensure controlled releases, backward compatibility, and smooth adoption of the common library across teams.
* Integrate logging, monitoring, and data quality controls using Grafana and DQX.
* Work closely with DataOps to ensure stability, observability, and reliability in production environments.
Desired Qualifications
* Databricks Certified Associate or Professional certification.
* Microsoft Azure Fundamentals (AZ-900) or equivalent basic Azure certification.
* Experience contributing to shared platforms or internal frameworks used by multiple teams.
* Experience working in international or distributed environments.
Platform & Infrastructure
Azure Databricks | Airflow (AKS) | Terraform | Azure ADLS Gen2 | GitLab CI/CD | Databricks Asset Bundles (DABs) | Kafka (Aiven) | Grafana | DQX (Data Quality Framework) | Python | PySpark | SQL
If this sounds like you, share your CV with us and let's talk!