Senior Data Engineer (Python, DataStage, Perl, Java)

Posted on April 8, 2025

Job Description

  • Role: Senior Data Engineer (Python, DataStage, Perl, Java)
  • Type: Remote
  • Experience: 5+ Years
  • Contract Duration: 6 Months
  • BGV Mandatory - Details in JD

JD - Senior Data Engineer (DataStage)

We are seeking a highly skilled and motivated Senior Data Engineer with over 5 years of experience, primarily in Python development, and a strong working knowledge of IBM DataStage, Perl, and Java. The ideal candidate will play a key role in designing, developing, and maintaining data pipelines and ETL solutions, ensuring data integrity, scalability, and performance across systems.
Key Responsibilities:

  • Design, develop, and optimize scalable data pipelines using Python as the primary language (a minimal illustrative sketch follows this list).
  • Build and maintain ETL processes with IBM DataStage, integrating diverse data sources.
  • Maintain and enhance legacy scripts written in Perl and Java, ensuring smooth migration to, or integration with, modern systems.
  • Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality solutions.
  • Troubleshoot data issues and ensure high data quality and integrity across systems.
  • Develop unit and integration tests, participate in code reviews, and follow best practices for code documentation and deployment.
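To give candidates a concrete sense of the day-to-day work, here is a minimal sketch of the kind of Python extract-transform-load flow this role involves, including a small unit test. It is purely illustrative: the sqlite3 backend and the raw_orders/clean_orders tables are assumptions made for the example, not part of the actual stack.

```python
"""Minimal ETL sketch in plain Python (illustrative only).

Assumptions: sqlite3 stands in for Oracle/SQL Server/PostgreSQL, and the
raw_orders/clean_orders tables are hypothetical.
"""
import sqlite3
from typing import Iterable, List, Tuple


def extract(conn: sqlite3.Connection) -> Iterable[Tuple]:
    # Pull raw rows from a hypothetical staging table.
    return conn.execute("SELECT order_id, amount FROM raw_orders")


def transform(rows: Iterable[Tuple]) -> List[Tuple]:
    # Drop rows with missing amounts and normalise amounts to integer cents.
    return [
        (order_id, round(float(amount) * 100))
        for order_id, amount in rows
        if amount is not None
    ]


def load(conn: sqlite3.Connection, rows: List[Tuple]) -> None:
    # Write the cleaned rows to a hypothetical target table.
    conn.executemany(
        "INSERT INTO clean_orders (order_id, amount_cents) VALUES (?, ?)", rows
    )
    conn.commit()


def run_pipeline(conn: sqlite3.Connection) -> None:
    load(conn, transform(extract(conn)))


def test_transform_skips_null_amounts():
    # Tiny unit test of the transform step (runnable with pytest).
    assert transform([(1, "10.5"), (2, None)]) == [(1, 1050)]
```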
Required Skills:

  • 5+ years of professional experience in software/data engineering.
  • Strong expertise in Python for data processing, scripting, and automation.
  • Experience with IBM DataStage for ETL development and maintenance.
  • Hands-on experience with Perl for scripting and maintaining legacy systems (a hypothetical Python-to-Perl integration sketch follows this list).
  • Working knowledge of Java, particularly for backend data-related services.
  • Proficiency with relational databases (e.g., Oracle, SQL Server, PostgreSQL).
  • Familiarity with version control tools such as Git and with CI/CD pipelines.
  • Strong problem-solving skills and the ability to work independently or as part of a team.
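Because the role mixes modern Python with legacy Perl, integration often means wrapping an existing script rather than rewriting it. The sketch below shows one common pattern; the script path legacy/normalize_feed.pl and its CSV-on-stdout output are hypothetical assumptions, not a description of the client's codebase.

```python
"""Hypothetical pattern for calling a legacy Perl script from Python.

The script path (legacy/normalize_feed.pl) and its CSV-on-stdout contract
are assumptions made for illustration.
"""
import csv
import io
import subprocess
from typing import Dict, List


def run_legacy_normalizer(input_path: str) -> List[Dict[str, str]]:
    # Invoke the legacy Perl script and capture its CSV output from stdout.
    result = subprocess.run(
        ["perl", "legacy/normalize_feed.pl", input_path],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError if the script exits non-zero
    )
    # Parse the emitted CSV into dictionaries keyed by column name.
    return list(csv.DictReader(io.StringIO(result.stdout)))


if __name__ == "__main__":
    for row in run_legacy_normalizer("data/raw_feed.txt"):
        print(row)
```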
Preferred Qualifications:

  • Experience with cloud platforms (AWS, Azure, GCP).
  • Knowledge of data warehousing and big data technologies.
  • Familiarity with Agile methodologies.
Educational Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Background verification will be required. Mandatory checks for BGV:

  • Education Check
  • Employment Check
  • Database Check
  • Criminal Check
  • Address Check
  • CIBIL Check
  • Reference Check
  • ID Check
  • CV Validation
  • Gap Check (from education to employment)
  • Credit Check
  • Drug Test

Note: Candidates need to visit the client location for laptop pickup.

Required Skills

Strong expertise in Python for data processing, scripting, and automation; experience with IBM DataStage for ETL development and maintenance; cloud platforms (AWS, Azure, GCP) a plus.