Job Post Details
Sr. Data Engineer Azure Databricks
Job details
Job type
- Full-time
Full job description
This is a remote, contract position responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization and Advanced Analytics).
We are looking for a skilled Senior Data Engineer with a strong background in Python, SQL, PySpark, Azure, Databricks, Synapse, Azure Data Lake, DevOps, and cloud-based large-scale data applications, and a passion for data quality, performance, and cost optimization. The ideal candidate will develop in an Agile environment, contributing to the architecture, design, and implementation of data products in the aviation industry, including migration from Synapse to Azure Data Lake. This role involves hands-on coding, mentoring junior staff, and collaborating with multidisciplinary teams to achieve project objectives.
Responsibilities
Architect, design, develop, test, and maintain high-performance, large-scale, complex data architectures that support data integration (batch and real-time, ETL and ELT patterns across heterogeneous data systems: APIs and platforms), storage (data lakes, warehouses, data lakehouses, etc.), processing, orchestration, and infrastructure. Ensure the scalability, reliability, and performance of data systems, with a focus on Databricks and Azure.
Contribute to detailed design, architectural discussions, and customer requirements sessions.
Actively participate in the design, development, and testing of big data products.
Construct and fine-tune Apache Spark jobs and clusters within the Databricks platform.
Migrate out of Azure Synapse to Azure Data Lake or other technologies.
Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive).
Design and implement data models and schemas that support efficient data processing and analytics.
Design and develop clear, maintainable code with automated testing using Pytest, unittest, integration tests, performance tests, regression tests, etc.
Collaborate with cross-functional teams (Product, Engineering, Data Scientists, and Analysts) to understand data requirements and develop data solutions, including reusable components that meet product deliverables.
Evaluate and implement new technologies and tools to improve data integration, processing, storage, and analysis.
Evaluate, design, implement and maintain data governance solutions: cataloging, lineage, data quality and data governance frameworks that are suitable for a modern analytics solution, considering industry-standard best practices and patterns.
Continuously monitor and fine-tune workloads and clusters to achieve optimal performance.
Provide guidance and mentorship to junior team members, sharing knowledge and best practices.
Maintain clear and comprehensive documentation of the solutions, configurations, and best practices implemented.
Promote and enforce best practices in data engineering, data governance, and data quality.
Ensure data quality and accuracy.
Design, implement, and maintain data security and privacy measures.
Be an active member of an Agile team, participating in all ceremonies and continuous improvement activities, being able to work independently as well as collaboratively.
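As a rough illustration of the "clear, maintainable code with automated testing" responsibility above, the sketch below pairs a small record-cleaning transformation with a Pytest-style unit test. It is a pure-Python stand-in with no Spark dependency; the `Flight` record and `clean_flights` function are hypothetical examples, not part of this posting's codebase.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass(frozen=True)
class Flight:
    flight_id: str
    origin: str
    dest: str
    delay_min: int

def clean_flights(rows: Iterable[dict]) -> list[Flight]:
    """Validate, normalize, and deduplicate raw flight records."""
    seen: set[str] = set()
    out: list[Flight] = []
    for row in rows:
        fid = (row.get("flight_id") or "").strip()
        if not fid or fid in seen:  # drop blank IDs and duplicates
            continue
        seen.add(fid)
        out.append(Flight(
            flight_id=fid,
            origin=row.get("origin", "").upper(),       # normalize airport codes
            dest=row.get("dest", "").upper(),
            delay_min=max(0, int(row.get("delay_min", 0))),  # clamp negative delays
        ))
    return out

# Pytest-style unit test (discoverable by `pytest`, or callable directly):
def test_clean_flights_dedupes_and_normalizes():
    raw = [
        {"flight_id": "AZ100", "origin": "gru", "dest": "gig", "delay_min": -5},
        {"flight_id": "AZ100", "origin": "gru", "dest": "gig", "delay_min": 10},
        {"flight_id": "", "origin": "cgh", "dest": "sdu", "delay_min": 3},
    ]
    cleaned = clean_flights(raw)
    assert len(cleaned) == 1
    assert cleaned[0].origin == "GRU"
    assert cleaned[0].delay_min == 0
```

In a real Databricks job the same shape applies: the transformation logic lives in a plain function that can be unit-tested locally, and the Spark-specific I/O stays at the edges.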
Requirements:
- Must have a Bachelor's degree (full-time program) in Computer Science or a similar field.
- At least 5 years of experience as a data engineer, with strong expertise in Databricks, DevOps, and Azure or other hyperscalers.
- 5+ years of experience with Azure DevOps, GitHub.
- Proven experience delivering large scale projects and products for Data and Analytics, as a data engineer, including migrations.
- The following certifications:
  - Databricks Certified Associate Developer for Apache Spark
  - Databricks Certified Data Engineer Associate
  - Microsoft Certified: Azure Fundamentals
  - Microsoft Certified: Azure Data Engineer Associate
  - Microsoft Exam: Designing and Implementing Microsoft DevOps Solutions (nice to have)
- Strong programming Skills in one or more languages such as Python (must have), Scala, and proficiency in writing efficient and optimized code for data integration, migration, storage, processing and manipulation.
- Strong understanding and experience with SQL and writing advanced SQL queries.
- Thorough understanding of big data principles, techniques, and best practices.
- Strong experience with scalable and distributed Data Processing Technologies such as Spark/PySpark (must have: experience with Azure Databricks), DBT and Kafka, to be able to handle large volumes of data.
- Solid Databricks development experience with significant Python, PySpark, Spark SQL, Pandas, NumPy in Azure environment.
- Strong experience in designing and implementing efficient ELT/ETL processes in Azure and Databricks, using open-source solutions and developing custom integration solutions as needed.
- Skilled in Data Integration from different sources such as APIs, databases, flat files, event streaming.
- Expertise in data cleansing, transformation, and validation.
- Proficiency with relational databases (Oracle, SQL Server, MySQL, Postgres, or similar) and NoSQL databases (MongoDB, Azure Table Storage, or similar).
- Good understanding of Data Modeling and Database Design Principles. Being able to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions.
- Strong experience in designing and implementing data warehousing, data lake, and data lakehouse solutions in Azure and Databricks.
- Good experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT).
- Strong understanding of the software development lifecycle (SDLC), especially Agile methodologies.
- Strong knowledge of SDLC tools and technologies such as Azure DevOps and GitHub, including project management software (Jira, Azure Boards, or similar), source code management (GitHub, Azure Repos, or similar), CI/CD systems (GitHub Actions, Azure Pipelines, Jenkins, or similar), and binary repository managers (Azure Artifacts or similar).
- Strong understanding of DevOps principles, including continuous integration and continuous delivery (CI/CD), infrastructure as code (IaC: Terraform, ARM templates, with hands-on experience), configuration management, automated testing, performance tuning, and cost management and optimization.
- Strong knowledge of cloud computing, specifically Microsoft Azure services related to data and analytics, such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake, Azure Stream Analytics, SQL Server, Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database, etc.
- Experience in Orchestration using technologies like Databricks workflows and Apache Airflow.
- Strong knowledge of data structures and algorithms and good software engineering practices.
- Proven experience migrating from Azure Synapse to Azure Data Lake, or other technologies.
- Strong analytical skills to identify and address technical issues, performance bottlenecks, and system failures.
- Proficiency in debugging and troubleshooting issues in complex data and analytics environments and pipelines.
- Good understanding of Data Quality and Governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent.
- Experience with BI solutions including PowerBI is a plus.
- Strong written and verbal communication skills to collaborate and articulate complex situations concisely with cross-functional teams, including business users, data architects, DevOps engineers, data analysts, data scientists, developers, and operations teams.
- Ability to document processes, procedures, and deployment configurations.
- Understanding of security practices, including network security groups, Azure Active Directory, encryption, and compliance standards.
- Ability to implement security controls and best practices within data and analytics solutions, including proficient knowledge and working experience on various cloud security vulnerabilities and ways to mitigate them.
- Self-motivated with the ability to work well in a team, and experienced in mentoring and coaching different members of the team.
- A willingness to stay updated with the latest services, Data Engineering trends, and best practices in the field.
- Comfortable with picking up new technologies independently and working in a rapidly changing environment with ambiguous requirements.
- Care about architecture, observability, testing, and building reliable infrastructure and data pipelines.
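For readers less familiar with the Delta Lake items in the list above, the upsert pattern they imply can be sketched without any Spark dependency. This is an illustrative stand-in, not Databricks' actual API; in Databricks this would typically be done with `DeltaTable.merge` or a `MERGE INTO` SQL statement.

```python
def merge_upsert(target: dict[str, dict], updates: list[dict], key: str = "id") -> dict[str, dict]:
    """Upsert semantics analogous to Delta Lake's MERGE INTO:
    rows whose key matches are updated, new rows are inserted."""
    merged = dict(target)  # leave the input table untouched
    for row in updates:
        # merge matched row's fields with the update, or insert the new row
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return merged

target = {"1": {"id": "1", "status": "scheduled"}}
updates = [
    {"id": "1", "status": "departed"},   # matched  -> update
    {"id": "2", "status": "scheduled"},  # no match -> insert
]
result = merge_upsert(target, updates)
# result["1"]["status"] == "departed"; key "2" is now present
```

The value of Delta Lake over this toy version is that the real MERGE is transactional (ACID) over files in the lake, which is what makes the lakehouse pattern viable at scale.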
Benefits
- Meal/food voucher
- Medical assistance - NotreDame Intermédica | Hapvida: no waiting period in the first month after hiring. The company subsidizes 100% of the monthly fee for the employee and 50% for each dependent included. The plan includes co-participation for exams, consultations, and emergency room visits.
- Dental Plan - Amil
- Life insurance
- Daycare Assistance
- Agreements with Partner Companies
- Training Program
- Sesc
- Gympass/Wellhub