AWS Data Engineer

Posted on January 12, 2026

Job Description

Overview

Work Location: Remote

Experience: 6+ years

Duration: 6 months or more (Remote)

Senior Associate, AWS Data Engineer: 6 to 10 years

Manager, AWS Data Engineer: 10+ years

Key Responsibilities

  • Design, develop, and deploy robust, scalable, real-time AWS-based data solutions aligned with PwC’s high standards and client business objectives.
  • Architect and maintain secure, compliant data lake and warehouse solutions utilizing AWS services such as S3, Redshift, EMR, Lambda, Kinesis, and more.
  • Build efficient ETL/ELT workflows that support diverse, complex data sources, and enable timely data availability for analytics and reporting.
  • Lead the creation and optimization of event-driven, streaming data pipelines that support advanced analytics, AI/ML models, and decision-making processes.
  • Partner with PwC consultants, data scientists, BI engineers, and client teams to translate business problems into effective technical solutions.
  • Ensure adherence to PwC’s rigorous data governance, security, and privacy policies throughout the solution lifecycle.
  • Implement automation and DevOps best practices within data engineering pipelines to enhance reliability, scalability, and deployment velocity.
  • Stay abreast of AWS innovations and industry trends to continually bring new opportunities to PwC’s data practice and clients.

Preferred Qualifications

  • Proven experience building enterprise-level data engineering solutions on AWS, supporting large-scale data initiatives.
  • Proficiency in Python and hands-on experience with data orchestration tools (Apache Airflow, AWS Step Functions, or equivalent).
  • Solid hands-on experience with distributed data frameworks and streaming technologies such as Apache Spark, Kafka, and Hive.
  • Hands-on expertise with the AWS ecosystem, including S3, Lambda, Kinesis, Redshift, EMR, Glue, and SQS.
  • Strong data modeling skills and experience designing ETL/ELT pipelines for batch and real-time ingestion.
  • Familiarity with CI/CD pipelines, version control, and infrastructure-as-code tools (CloudFormation, Terraform).
  • Exceptional analytical skills with a proactive, problem-solving approach and a strong customer orientation.
  • Excellent communication and collaboration skills suited for a consulting environment involving clients and internal teams.
  • Bachelor’s or higher degree in Computer Science, Information Systems, Engineering or related fields; AWS certification(s) highly desirable.

Required Skills

ETL/ELT, AI/ML
