Data Engineering

Posted on May 22, 2025

Apply Now

Job Description

  • We are urgently seeking a Data Engineer to implement a data lakehouse architecture on Google Cloud Platform for our client.
  • The full job description and all relevant details are below. Please share candidate profiles as soon as possible for shortlisting.
  • Experience: 4+ years in Data Engineering and 2+ years in Google Cloud Platform
  • Contract Duration: Minimum 3 months
  • Location: Remote

Role Summary

  • Implement the data lakehouse architecture on Google Cloud Platform that will serve as the foundation for ABC's knowledge graph and agentic AI systems.

Key Responsibilities

  • Design and build data pipelines for ingesting diverse data sources into the lakehouse (a minimal pipeline sketch follows this list)
  • Implement ETL processes using Dataflow and dbt for data transformations
  • Set up BigQuery and Google Cloud Storage for structured and unstructured data (see the setup sketch after the Requirements list)
  • Configure Dataplex for metadata management and data governance
  • Ensure data quality and lineage throughout the platform
  • Optimize data architecture for performance and scalability
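
To illustrate the kind of ingestion work this role involves, here is a minimal Apache Beam pipeline sketch that could run on Dataflow: it reads newline-delimited JSON from Cloud Storage and writes rows to BigQuery. All project, bucket, dataset, and table names below are hypothetical placeholders, and the field mapping is an assumed example, not the client's actual schema.

    # Hypothetical ingestion pipeline: GCS (newline-delimited JSON) -> BigQuery.
    # All resource names below are placeholders, not the client's real resources.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(line: str) -> dict:
        """Flatten one raw JSON record into a BigQuery-ready row."""
        record = json.loads(line)
        return {
            "event_id": record.get("id"),
            "event_type": record.get("type"),
            "occurred_at": record.get("timestamp"),
        }


    def run() -> None:
        options = PipelineOptions(
            runner="DataflowRunner",                  # use "DirectRunner" for local tests
            project="example-project",                # placeholder project ID
            region="us-central1",
            temp_location="gs://example-bucket/tmp",  # placeholder staging bucket
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/events/*.json")
                | "ParseJSON" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:lakehouse.events",
                    schema="event_id:STRING,event_type:STRING,occurred_at:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()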

Requirements

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or related field
  • 4+ years of experience as a data engineer, with 2+ years focused on GCP
  • Deep knowledge of BigQuery, Dataflow, Cloud Storage, and Dataplex
  • Experience with dbt and data transformation workflows
  • Background in data modeling and schema design
  • Proficiency in Python, SQL, and cloud infrastructure as code
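
As a concrete reference for the BigQuery and Cloud Storage setup mentioned in the responsibilities, the sketch below uses the google-cloud-bigquery Python client to create a curated dataset and expose raw Parquet files in Cloud Storage as an external table. All names are hypothetical placeholders; the real layout would follow the client's data model.

    # Hypothetical lakehouse bootstrap: a curated BigQuery dataset plus an
    # external table over raw Parquet files in Cloud Storage. All names are
    # placeholders for illustration only.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project

    # Dataset that will hold curated lakehouse tables.
    dataset = bigquery.Dataset("example-project.lakehouse")
    dataset.location = "US"
    client.create_dataset(dataset, exists_ok=True)

    # Register raw files as an external table so transformation jobs can
    # query them in place, without a separate load step.
    external_config = bigquery.ExternalConfig("PARQUET")
    external_config.source_uris = ["gs://example-bucket/raw/orders/*.parquet"]

    table = bigquery.Table("example-project.lakehouse.orders_raw")
    table.external_data_configuration = external_config
    client.create_table(table, exists_ok=True)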

Required Skills

  • 4+ years of experience as a data engineer, with 2+ years focused on GCP
  • Deep knowledge of BigQuery, Dataflow, Cloud Storage, and Dataplex
  • Experience with dbt and data transformation workflows
  • Background in data modeling and schema design