Snowflake Data Engineer

Posted on February 12, 2025


Job Description

  • Start date: within a week to 10 days
  • Contract duration: 3 months
  • Work timing: 1:00 p.m. – 10:00 p.m.
  • Location: Remote
  • Experience: 4+ years
  • Employment Type: [Full-time/Contract]

We are seeking a consultant who is strong in Snowflake: a skilled Data Engineer with a proven background in designing and implementing scalable data solutions. The ideal candidate will have experience working with Snowflake as a cloud data platform and a solid understanding of data modeling, ETL processes, and performance optimization.

Key Responsibilities

  • Snowflake Development:
      • Design and implement data pipelines and workflows on Snowflake.
      • Develop and optimize Snowflake databases, schemas, and tables.
      • Manage Snowflake features such as virtual warehouses, data sharing, and materialized views.
  • Data Integration:
      • Build and maintain ETL/ELT pipelines using tools like Talend, Informatica, Matillion, or equivalent (see the load-step sketch after this list).
      • Integrate data from various sources such as APIs, databases, and flat files.
  • Performance Optimization:
      • Monitor and optimize query performance in Snowflake.
      • Implement clustering, partition pruning, and caching strategies to enhance performance.
  • Data Modeling:
      • Design logical and physical data models based on business requirements.
      • Ensure data accuracy, consistency, and security across pipelines.
  • Collaboration:
      • Work closely with data analysts, business stakeholders, and other engineers to deliver data solutions.
      • Document processes, workflows, and architecture.
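
To give a flavor of the pipeline work above, here is a minimal load-step sketch in Python using the snowflake-connector-python package. It is an illustration only, not this team's actual stack: the warehouse, database, table, and stage names (ETL_WH, ANALYTICS, raw_events, @raw_stage) are hypothetical placeholders, and credentials are read from environment variables.

    # Minimal sketch of a batch load step on Snowflake. All object
    # names and credentials below are hypothetical placeholders.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",    # hypothetical virtual warehouse
        database="ANALYTICS",  # hypothetical database
        schema="STAGING",      # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Target table, clustered on event_date so that date-filtered
        # queries can prune micro-partitions.
        cur.execute("""
            CREATE TABLE IF NOT EXISTS raw_events (
                event_id   STRING,
                event_date DATE,
                payload    STRING
            )
            CLUSTER BY (event_date)
        """)
        # Bulk-load files from a pre-configured stage; @raw_stage is a
        # hypothetical external stage pointing at cloud storage.
        cur.execute("""
            COPY INTO raw_events
            FROM @raw_stage
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        """)
        print(cur.fetchall())  # one status row per loaded file
    finally:
        conn.close()

The clustering key illustrates one of the optimization strategies listed above; on small tables Snowflake's automatic micro-partitioning is usually enough, so a clustering key is worth adding only once partition pruning becomes a measurable bottleneck.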

Required Skills

  • Technical Proficiency:
      • 4+ years of experience in data engineering, with at least 2 years working on Snowflake.
      • Strong experience with SQL for querying and scripting.
      • Knowledge of data warehousing concepts and methodologies.
  • Cloud Expertise:
      • Familiarity with cloud platforms such as AWS, Azure, or GCP.
      • Hands-on experience with Snowflake utilities like SnowSQL and Snowpipe (see the sketch after this list).
  • ETL/ELT Tools:
      • Experience with ETL tools like Matillion, Talend, or Informatica.
      • Scripting experience in Python, Java, or Scala for custom ETL pipelines.
  • Other Tools:
      • Proficiency with dbt (Data Build Tool) is a plus.
      • Experience with CI/CD pipelines for deploying data solutions.
  • Soft Skills:
      • Strong problem-solving and analytical abilities.
      • Excellent communication and teamwork skills.
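
As a concrete illustration of the Snowpipe experience asked for above, the sketch below defines an auto-ingest pipe through the same Python connector (the DDL could equally be pasted into SnowSQL). This is a hedged example, not a prescribed setup: AUTO_INGEST = TRUE assumes the stage's cloud storage already publishes event notifications to Snowflake, and all object names are again placeholders.

    # Hypothetical Snowpipe that continuously loads staged files into
    # the raw_events table from the previous sketch.
    import os

    import snowflake.connector

    with snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    ) as conn:
        cur = conn.cursor()
        cur.execute("""
            CREATE PIPE IF NOT EXISTS events_pipe
            AUTO_INGEST = TRUE
            AS
            COPY INTO raw_events
            FROM @raw_stage
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        """)
        # SYSTEM$PIPE_STATUS reports the pipe's execution state and
        # the number of files queued for ingestion.
        cur.execute("SELECT SYSTEM$PIPE_STATUS('events_pipe')")
        print(cur.fetchone()[0])

In practice a definition like this would live in version control and be rolled out through the CI/CD pipeline mentioned above.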

Preferred Qualifications

  • SnowPro Certification or equivalent credentials.
  • Experience in Agile/Scrum development methodologies.
  • Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).

Required Skills

ETL, SQL, data warehousing, AWS, Azure, GCP