Data Architect

Posted on February 10, 2025

Job Description

  • Job Title: Data Architect
  • Location: Mahadevpura – Bangalore
  • Experience: 10+ Years
  • Data Integration:
      • Design and implement data integration workflows to connect disparate systems (e.g., APIs, ETL/ELT tools, and data lakes).
      • Work with stakeholders to understand data requirements and ensure integration aligns with business objectives.
      • Maintain and enhance existing integration pipelines for efficiency and reliability.
  • Data Engineering:
      • Develop, test, and deploy scalable and efficient data pipelines using industry-standard tools and programming languages.
      • Optimize data storage, retrieval, and processing workflows to meet performance requirements.
      • Monitor and troubleshoot data pipelines to ensure consistent data delivery.
  • Collaboration & Stakeholder Management:
      • Partner with data analysts, data scientists, and business teams to provide clean, usable datasets.
      • Collaborate with IT and software development teams to ensure smooth deployment of data systems.
  • Technology & Tools Implementation:
      • Evaluate and integrate data management tools (ETL platforms, cloud services, and database systems).
      • Ensure compliance with data governance policies, privacy regulations, and security protocols.
  • Documentation & Reporting:
      • Create and maintain comprehensive documentation for data integration and engineering processes.
      • Provide regular status updates and reports to stakeholders on data system performance and issues.
  • Required Skills & Qualifications:
      • Education:
          • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.
      • Technical Skills:
          • Proficiency in ETL/ELT tools (e.g., Talend, Informatica, Apache NiFi).
          • Strong programming skills in Python, SQL, Scala, or Java.
          • Experience with cloud platforms (AWS, Azure, GCP) and services (e.g., S3, Redshift, BigQuery).
          • Knowledge of databases (SQL and NoSQL) and data warehouse systems.
          • Familiarity with tools like Apache Kafka, Apache Airflow, or similar orchestration platforms (see the illustrative pipeline sketch after this list).
      • Soft Skills:
          • Strong problem-solving and analytical skills.
          • Excellent communication and collaboration abilities.
          • Attention to detail with a commitment to quality and accuracy.
      • Preferred Qualifications:
          • Experience in big data technologies (Hadoop, Spark).
          • Knowledge of data governance and compliance standards (e.g., GDPR, CCPA).
          • Certification in cloud technologies or data engineering tools.
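
For context, the kind of orchestrated data pipeline referenced above (Apache Airflow with Python) might look like the minimal sketch below. This is purely illustrative and not part of the role or any specific codebase: the DAG name, the daily schedule, and the extract/transform/load bodies are hypothetical placeholders, and it assumes Airflow 2.4+ with the TaskFlow API.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
    def sample_daily_pipeline():
        # Hypothetical extract step: pull raw records from a source system or API.
        @task
        def extract() -> list[dict]:
            return [{"id": 1, "amount": 120.0}, {"id": 2, "amount": 75.5}]

        # Hypothetical transform step: clean and reshape records for the warehouse.
        @task
        def transform(records: list[dict]) -> list[dict]:
            return [{**r, "amount_usd": round(r["amount"], 2)} for r in records]

        # Hypothetical load step: write the transformed records to a target table.
        @task
        def load(records: list[dict]) -> None:
            print(f"Loading {len(records)} records into the warehouse")

        # Wire the tasks into a simple extract -> transform -> load dependency chain.
        load(transform(extract()))


    sample_daily_pipeline()

In practice, each step would call out to the ETL/ELT tools and cloud services named in the qualifications (e.g., staging files in S3 and loading into Redshift or BigQuery) rather than passing small in-memory lists.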

Required Skills

  • APIs, ETL/ELT tools, and data lakes
  • Talend, Informatica, Apache NiFi
  • Python, SQL, Scala, or Java
  • Databases (SQL and NoSQL)