Data Lakehouse Architecture on Google Cloud Platform
Posted on May 19, 2025
Job Description
- Data Lakehouse Architecture on Google Cloud Platform
- Remote
- Contract
- Experience: 4+ years
- Role Summary:
- Implement a data lakehouse architecture on Google Cloud Platform that will serve as the foundation for ABC's knowledge graph and agentic AI systems.
- Key Responsibilities:
- Design and build data pipelines for ingesting diverse data sources into the lakehouse
- Implement ETL processes using Dataflow and dbt for data transformations (see the pipeline sketch after this list)
- Set up BigQuery and Google Cloud Storage for structured and unstructured data (see the table-creation sketch after this list)
- Configure Dataplex for metadata management and data governance
- Ensure data quality and lineage throughout the platform
- Optimize data architecture for performance and scalability
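As a rough illustration of the kind of ingestion work this role involves, here is a minimal Apache Beam (Dataflow) sketch in Python that reads newline-delimited JSON from Cloud Storage and appends rows to a BigQuery table. The bucket path, the table `example-project:lakehouse.events`, and the field names are hypothetical placeholders, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Turn one newline-delimited JSON record into a BigQuery row dict."""
    record = json.loads(line)
    return {
        "event_id": record["id"],          # hypothetical source fields
        "event_ts": record["timestamp"],
        "payload": json.dumps(record.get("payload", {})),
    }


def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, etc. on the CLI.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadRawJson" >> beam.io.ReadFromText(
                "gs://example-lakehouse-raw/events/*.json"
            )
            | "ParseEvents" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:lakehouse.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                # CREATE_NEVER assumes the table was provisioned ahead of time,
                # as in the table-creation sketch below.
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```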
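On the BigQuery side, setup might look like the following sketch using the official google-cloud-bigquery client: a dataset plus a date-partitioned, clustered events table. The project, dataset, and schema are again illustrative assumptions chosen to match the pipeline sketch above.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

# Create the dataset if it does not exist yet.
dataset = bigquery.Dataset("example-project.lakehouse")
dataset.location = "US"
client.create_dataset(dataset, exists_ok=True)

# Hypothetical schema matching the rows produced by the Beam sketch.
schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("example-project.lakehouse.events", schema=schema)
# Partition by event date and cluster by event_id to keep scans cheap at scale.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["event_id"]
client.create_table(table, exists_ok=True)
```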
- Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or related field
- 4+ years of experience as a data engineer, with 2+ years focused on GCP
- Deep knowledge of BigQuery, Dataflow, Cloud Storage, and Dataplex
- Experience with dbt and data transformation workflows
- Background in data modeling and schema design
- Proficiency in Python, SQL, and cloud infrastructure as code (see the sketch after this list)
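To make the infrastructure-as-code requirement concrete, here is a minimal Pulumi sketch in Python that provisions a raw-zone bucket and a curated BigQuery dataset. Pulumi is just one possible tool here (Terraform would be equally common), and every resource name below is made up for illustration.

```python
import pulumi
import pulumi_gcp as gcp

# Raw zone: a Cloud Storage bucket for landing unstructured data.
raw_bucket = gcp.storage.Bucket(
    "raw-zone",
    location="US",
    uniform_bucket_level_access=True,
)

# Curated zone: a BigQuery dataset for modeled, query-ready tables.
curated = gcp.bigquery.Dataset(
    "curated",
    dataset_id="curated",
    location="US",
)

pulumi.export("raw_bucket_name", raw_bucket.name)
pulumi.export("curated_dataset_id", curated.dataset_id)
```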
Required Skills
data lake
architecture