Description: Full-time contractor; Location: Malaysia
Role Overview
We are seeking experienced Data Engineers to support the build-out and maintenance of modular data pipelines and analytics infrastructure as part of an AI transformation program for a major telecommunications client in Malaysia. The role will focus on enabling data flow automation, improving pipeline efficiency, and ensuring scalable integration with analytics and AI workloads hosted on Azure. You will work alongside data scientists, DevOps engineers, and domain experts to operationalize data solutions across multiple environments (sandbox, staging, and production) and contribute to the robustness and cost-effectiveness of our data platform.
Key Responsibilities
- Design, develop, and maintain data pipelines using Airflow
- Build modular and incremental data flows that allow efficient refresh cycles and minimize redundant runs
- Collaborate with cross-functional teams to integrate AI/ML models into production data environments
- Develop ETL/ELT workflows to ingest and transform data from multiple telecom and operational systems
- Ensure data quality, version control, and governance through consistent documentation and reusable components
- Participate in troubleshooting and optimization of data processes to ensure reliability, scalability, and performance
- Contribute to domain logic validation in partnership with business and analytics teams
Required Skills & Experience
- 3–7 years of hands-on experience in data engineering or related roles
- Strong proficiency in Python, SQL, and modern data frameworks (e.g., Airflow, Spark)
Preferred Qualifications
- Working knowledge of the Azure data ecosystem (Data Factory, Azure ML, Synapse, Blob Storage, Azure SQL, etc.)
- Familiarity with AWS services (S3, ECS, RDS, EC2)
- Experience designing incremental/modular data pipelines and integrating APIs or external data sources
- Strong understanding of data governance, versioning, and access management practices
- Excellent problem-solving skills, structured thinking, and strong communication abilities