Senior DevOps Engineer
Indeed
Full-time
Onsite
No experience requirement
No degree requirement
PA239-Parada / Museo Militar, Santiago, Región Metropolitana, Chile
Description

**Job Summary:** We are looking for a Senior DevOps Engineer to design, build, and operate the DevOps capabilities that support the Data Platform, with an emphasis on DataOps and Software Engineering.

**Key Highlights:**
1. Key role in building and evolving a modern data platform
2. Focus on automation and building robust solutions
3. Collaborative work with high technical standards and impact

**Company Description:** We are over 88,000 people who work every day toward our firm Purpose: Simplify and Enjoy Life More. We operate in 9 countries through five major brands in diverse industries: Falabella Retail, Sodimac, Banco Falabella, Tottus, and Mallplaza. Each of these brands shapes who we are, and together, as One Team, we strive daily to reinvent ourselves and exceed our customers' expectations. A team full of dreams that makes things happen. We dare to launch and innovate, take risks, and create opportunities that keep us at the forefront, driving us to reinvent ourselves and deliver the best shopping experience at every touchpoint.

**Job Mission:** Provision the resources and infrastructure required by our solutions, ensuring continuous integration and deployment.

**Job Responsibilities:** If you have a curious mind, are passionate about automation, and enjoy building robust software and data solutions, this challenge is for you! At Falabella Technology, we are seeking a Senior DevOps Engineer to join our Data Platform team, playing a key role in building and evolving our modern data platform (Lakehouse) on GCP, enabling the Analytics, BI, Data Engineering, and Data Science teams. We are Falabella, a diverse team of over 100,000 collaborators across major brands including Falabella Retail, Sodimac, Banco Falabella, Seguros Falabella, Tottus, Mallplaza, Open Plaza, and Linio, with a presence in 7 Latin American countries plus offices in China and India.

**Every day is different:** we work collaboratively, with high technical standards and a focus on impact. If complex challenges and building strategic platforms for the business motivate you, this is your opportunity to make things happen.

**Role Purpose:** The main challenge of this role is to design, build, and operate the DevOps capabilities that support the Data Platform, ensuring continuous deployment, reliability, security, and cost efficiency, with a strong emphasis on DataOps and Software Engineering.

**Key Responsibilities:**
- Design, implement, and maintain CI/CD pipelines in GitLab/GitHub for infrastructure, software, and data components.
- Develop software and automations, primarily in Python, to support DataOps processes, validations, orchestration, and internal tooling.
- Create, deploy, and maintain infrastructure as code using Terraform, primarily on GCP, with attention to high availability, security, and cost optimization.
- Operate and support containerized workloads on Kubernetes, performing advanced troubleshooting of distributed services.
- Enable and operate DataOps workflows for the Lakehouse platform (dbt, data pipelines, environment management).
- Implement and evolve observability and monitoring for infrastructure, services, and data pipelines using Prometheus, Grafana, and Datadog.
- Work closely with the Data Engineering, Analytics, BI, and Data Science teams, understanding their needs and enabling platform capabilities.
- Define and promote DevOps, DataOps, and Software Engineering best practices, driving automation, quality, and reliability.

**Candidate Profile:** We are looking for an analytical, automation-oriented individual with an engineering mindset, combining solid DevOps experience with real software development capability and an understanding of data platforms.

**Desirable:**
- Experience working with modern data platforms (Lakehouse, dbt, Iceberg, Data Lakes)
- Knowledge of data and analytics architectures
- Experience in cloud cost optimization and security

**Requirements:**
- Solid experience as a DevOps Engineer in cloud environments (ideally GCP)
- Minimum 3 years of experience working with a CI/CD tool such as GitLab, Jenkins, etc.
- Experience with IaC tools such as Terraform, Pulumi, CloudFormation, etc.
- Basic knowledge of networking (VPC, LB, NAT, etc.)
- Proficiency in at least one scripting language such as Bash, Python, or JavaScript
- Practical knowledge of DataOps and operating data pipelines
- Experience administering Kubernetes clusters (AKS, GKE, EKS, etc.)
- Fluent Git usage (branching, rebasing, conflict resolution)
- Availability to work in hybrid mode
- Intermediate English proficiency

**Offer Conditions:**

Source: Indeed
Sofía Muñoz
Indeed · HR


© 2025 Servanan International Pte. Ltd.