Data Engineer

  • Kuala Lumpur, Malaysia
  • Full-Time
  • On-Site

Job Description:

About the Role
We are seeking an experienced Data Engineer to join our technology team. This role is pivotal in building and managing the robust data pipelines and platforms that form the foundation of our advanced analytics and AI solutions.

Key Responsibilities

  • Architect, develop, and maintain scalable batch and streaming data pipelines on Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.
  • Design and execute processes to load, transform, and optimize data within BigQuery to ensure high performance for analytics and reporting.
  • Integrate data from diverse and complex sources, including external APIs, various databases, and flat files.
  • Play a key role in data migration initiatives, assisting in the transition from legacy systems such as Oracle and MicroStrategy to the new cloud environment.
  • Implement and enforce best practices for data quality, data governance, and security compliance across all platforms.
  • Collaborate effectively with data analysts and business teams to understand requirements and provide the necessary data support for reporting.

Requirements

  • 3-5 years of professional experience in data engineering or complex ETL development.
  • Deep, hands-on experience with the GCP Data Stack (must have: BigQuery, Dataflow, Composer, Dataproc).
  • Solid, practical programming skills in both SQL and Python.
  • A strong understanding of data modelling concepts and database performance optimization techniques.
  • A proactive and curious mindset, with a strong willingness to learn and contribute to large-scale migration projects.
  • Familiarity with or prior experience in the Azure Data Stack is a significant plus.


How to Apply

To express your interest in this role, please submit your detailed resume and a summary of your relevant project experience.