Python Developer

Posted on June 10, 2025

Apply Now

Job Description

Technical Skills

  • BigQuery & SQL: Expert proficiency in writing advanced queries, creating UDFs, managing partitioned and clustered tables, and tuning cost/performance.
  • Python: Strong ability to write idiomatic, well-tested Python code for (a short sketch follows this list):
      • Consuming external APIs (e.g., python-binance)
      • Batch and streaming data pipelines (using google-cloud-bigquery, Apache Beam)
      • Deployable serverless functions/services (Cloud Functions, Cloud Run)
  • GCP Ecosystem:
      • Cloud Storage (storing raw/dump files; staging buckets)
      • Pub/Sub (loosely coupling ingestion and processing)
      • Dataflow (Apache Beam pipelines for complex/real-time transforms)
      • Cloud Composer (Airflow-based orchestration of multi-step DAGs)
      • Cloud Functions / Cloud Run (serverless compute for ingestion, transformation, and live trading)
      • Cloud Scheduler (cron-like triggers for functions and scheduled queries)
      • Secret Manager (secure storage of API keys and credentials)
      • IAM & VPC (defining least-privilege roles, networking fundamentals for private GKE clusters or Serverless VPC connectors)
  • Data Modeling / ETL Frameworks:
      • Expertise in dimensional modeling (star/snowflake) and normalization, implemented on BigQuery.
      • Familiarity with ETL/ELT tools like Apache Beam (Dataflow) or Airflow (Composer).
  • Infrastructure-as-Code & DevOps:
      • Experience with Terraform (preferred) or Deployment Manager for provisioning GCP resources.
      • Familiarity with containerization (Docker) and Git-based CI/CD workflows (e.g., Cloud Build, GitHub Actions).
  • Monitoring & Observability:
      • Basic proficiency setting up Cloud Logging, Cloud Monitoring alerts, and creating dashboards to track pipeline health, job failures, ingestion latency, and trading anomalies.
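
To make the Python and BigQuery expectations above concrete, here is a minimal sketch of the kind of code involved: pulling a spot price with python-binance and streaming it into BigQuery with google-cloud-bigquery. The table ID, environment variables, and schema are illustrative assumptions, not part of this posting.

```python
"""Minimal sketch: pull a spot price from Binance and stream it into BigQuery.

Assumes python-binance and google-cloud-bigquery are installed, credentials are
available via the environment (Application Default Credentials plus
BINANCE_API_KEY / BINANCE_API_SECRET), and the target table already exists.
"""
import datetime
import os

from binance.client import Client as BinanceClient
from google.cloud import bigquery

# Placeholder fully qualified table id -- replace with a real dataset/table.
TABLE_ID = "my-project.market_data.ticks"


def ingest_tick(symbol: str = "BTCUSDT") -> None:
    # Public market-data endpoints work without keys, but an authenticated
    # client is shown since live trading would require one.
    binance = BinanceClient(
        os.environ.get("BINANCE_API_KEY"),
        os.environ.get("BINANCE_API_SECRET"),
    )
    tick = binance.get_symbol_ticker(symbol=symbol)  # {'symbol': ..., 'price': ...}

    row = {
        "symbol": tick["symbol"],
        "price": float(tick["price"]),
        "ingested_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

    bq = bigquery.Client()
    # Streaming insert; returns a list of per-row errors (empty on success).
    errors = bq.insert_rows_json(TABLE_ID, [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")


if __name__ == "__main__":
    ingest_tick()
```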

Soft Skills

Key Responsibilities

BigQuery Implementation

  • Design, implement, and maintain data warehouses and data marts in BigQuery.
  • Write performant SQL queries and continually optimize for both cost and latency.
  • Manage partitioning, clustering, and table-sharding strategies to support both batch and streaming workloads (see the sketch after this list).
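
A hedged sketch of the partitioning and clustering work described above, using the google-cloud-bigquery client. The project, dataset, table, and field names are illustrative assumptions only.

```python
from google.cloud import bigquery

# Placeholder table id -- the dataset is assumed to exist already.
TABLE_ID = "my-project.market_data.ticks"

client = bigquery.Client()

schema = [
    bigquery.SchemaField("symbol", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("price", "FLOAT", mode="REQUIRED"),
    bigquery.SchemaField("ingested_at", "TIMESTAMP", mode="REQUIRED"),
]

table = bigquery.Table(TABLE_ID, schema=schema)
# Partition by ingestion timestamp (daily) and cluster by symbol, so queries
# filtering on a date range and a handful of symbols scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="ingested_at",
)
table.clustering_fields = ["symbol"]

table = client.create_table(table, exists_ok=True)
print(f"Created or reused {table.full_table_id}")
```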

Data Ingestion & Pipeline Orchestration

  • Build and maintain real-time ingestion pipelines (e.g., ingesting Binance API data via Python) into BigQuery.
  • Implement Pub/Sub topics (or equivalent decoupling layer) to buffer incoming market data before transformation (see the sketch after this list).
  • Create and maintain Cloud Scheduler jobs to trigger ingestion, transformation, and live-trading functions at precise intervals (e.g., every 15 seconds or once per minute).
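
A minimal sketch of the decoupled ingestion path described above: a Cloud Scheduler job invokes an HTTP-triggered function, which fetches a Binance ticker and publishes it to a Pub/Sub topic for downstream transformation into BigQuery. Project, topic, and function names are placeholders, and deployment details (Cloud Functions vs. Cloud Run) are omitted.

```python
"""Sketch of Cloud Scheduler -> HTTP function -> Pub/Sub ingestion buffering."""
import json
import os

import functions_framework
from binance.client import Client as BinanceClient
from google.cloud import pubsub_v1

# Placeholder project and topic identifiers, taken from assumed env vars.
PROJECT_ID = os.environ.get("GCP_PROJECT", "my-project")
TOPIC_ID = os.environ.get("MARKET_DATA_TOPIC", "market-ticks")

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


@functions_framework.http
def ingest(request):
    """HTTP entry point, intended to be hit by a Cloud Scheduler job."""
    symbol = request.args.get("symbol", "BTCUSDT")

    # Public ticker endpoint; no API keys needed for read-only market data.
    tick = BinanceClient().get_symbol_ticker(symbol=symbol)

    # Publish the raw tick; a Beam/Dataflow pipeline or subscriber-side
    # function can transform and load it into BigQuery without blocking
    # ingestion.
    future = publisher.publish(
        topic_path,
        data=json.dumps(tick).encode("utf-8"),
        source="binance",
    )
    message_id = future.result(timeout=30)
    return f"published {symbol} tick as message {message_id}", 200
```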

GCP Infrastructure & DevOps

Required Skills

Python