DataOps Engineer
Negotiable Salary
Indeed
Full-time
Onsite
No experience required
No degree required
PA239-Parada / Museo Militar, Santiago, Región Metropolitana, Chile
Description

At BC Tecnología, we are an IT consulting firm focused on providing services to clients in sectors such as finance, insurance, retail, and government. This role is part of our Data & Analytics initiatives, where we design and operate modern, reliable data pipelines to support strategic decision-making. The DataOps Engineer will work in a collaborative environment, integrating Data Engineering, Analytics, and DevOps to deliver scalable and secure cloud-based solutions. You will participate in projects for high-profile clients, promoting best practices in data quality, traceability, and operational efficiency, with a focus on innovation and continuous improvement. Apply without intermediaries through Get on Board.

### **Main Responsibilities**

* Administer and optimize the Databricks environment, managing clusters, notebooks, and automated jobs for production data flows.
* Design and implement efficient, scalable data pipelines in Azure Data Factory and/or Databricks Workflows, using PySpark as the primary language.
* Define and execute data quality automation strategies, including validations, alerts, and continuous monitoring.
* Develop infrastructure as code with Terraform, ensuring versioning and controlled deployment of cloud components.
* Integrate CI/CD processes for data pipelines, including unit testing and automated deployments.
* Collaborate with Data Engineering and Analytics teams to improve data flow, availability, and consistency.
* Monitor cloud environment costs, performance, and security, implementing proactive improvements.

### **Role Description and Required Skills**

We are looking for a DataOps Engineer with at least 3 years of experience in DataOps or Data Engineering and proven experience managing Databricks in production environments. Candidates must have skills in data quality automation, development and maintenance of PySpark data pipelines, and experience with Terraform for data infrastructure. Experience integrating DevOps (CI/CD) processes for data deployments and knowledge of data security and governance in Azure environments will be valued. The ideal candidate is proactive, results-oriented, has strong communication and teamwork skills, and can translate business requirements into robust, scalable technical solutions.

### **Desirable Requirements**

Advanced knowledge of Azure Monitor and Databricks metrics, experience with data quality testing tools such as Great Expectations, and familiarity with observability, auditing, and compliance practices. Experience in cloud cost optimization and designing Data Governance solutions is also valued. Relevant certifications in Azure or Data Engineering are a plus.

### **Benefits**

At BC Tecnología, we promote a collaborative work environment that values commitment and continuous learning. Our culture emphasizes professional growth through team integration and knowledge sharing. The hybrid work model we offer, based in Las Condes, combines the flexibility of remote work with in-person collaboration, supporting a better work-life balance and a more dynamic work experience. You will participate in innovative projects with high-profile clients across diverse sectors, in an environment that fosters inclusion, respect, and technical and professional development.

GETONBRD Job ID: 57335

**Remote work policy**
----------------------

**Hybrid**

This job is performed partly from home and partly at the office in Santiago (Chile).

Sofía Muñoz
Indeed · HR
