




Job Summary:
We are seeking a Functional Data Quality QA to validate data quality and consistency in our GCP DataLake, ensuring information reliability within Big Data agile teams.

Key Highlights:
1. Ensure information reliability in Big Data environments.
2. Validate data quality, consistency, and functional behavior in GCP.
3. Actively participate in Big Data agile teams.

At Stefanini LATAM, we are looking for a Functional Data Quality QA to join our team and help validate data quality, consistency, and functional behavior within our corporate DataLake on Google Cloud Platform (GCP). If you are passionate about ensuring information reliability, working with data, and participating in Big Data agile teams, this opportunity is for you!

Location: Santiago de Chile.
Work Mode: Hybrid, 2–3 days per week.

Main Responsibilities:
- Validate data ingestion processes across the DataLake layers: Raw, Curated, and Semantic.
- Verify integrity, completeness, duplication, and consistency across multiple data sources.
- Validate business rules, transformations, and calculations executed by data pipelines.
- Review SQL query results and outputs generated by transformation processes.
- Execute functional tests on data pipelines in Big Data environments.
- Review logs, services, jobs, and tasks executed within processing pipelines.
- Validate results in the cloud environments associated with the DataLake on Google Cloud Platform (GCP).
- Analyze data profiling results and data quality metrics.
- Design and execute functional test cases focused on data.
- Ensure compliance with functional criteria defined by the business, data domains, and Data Governance.
- Actively participate in Big Data agile teams and in the functional certification of user stories.
- Collaborate with Data Engineers, Data Owners, and Data Governance teams.

Technical Requirements and Competencies:
- Experience in functional testing focused on data.
- Strong SQL knowledge for validating integrity, calculations, and business rules.
- Experience validating data ingestion, transformations, and queries.
- Familiarity with Big Data ecosystems, DataLakes, and data pipelines.
- Experience reviewing logs, jobs, and processes within data pipelines.
- Proficiency with Atlassian tools such as Jira, TM4J / Xray, and Bitbucket.
- Knowledge of Google Cloud Platform (GCP) applied to data ecosystems.

Desirable Knowledge:
- Analytical data models: EST, DDL, and partitioning.
- Concepts of Data Governance, data lineage, and metadata.
- Experience with data domains and quality rules defined by GDD.
- Understanding of the data lifecycle across GCP environments (QA and Production).
- Experience in regulatory or compliance environments within the financial sector.
- Experience working under agile methodologies applied to data delivery.

Requirements:
- Minimum Education: Secondary Education (CH) / Technical
- Experience: 3 years
- Age: between 24 and 60 years

Keywords: quality, QA
