Design, build, and maintain data pipelines and ETL/ELT workflows across GCP and on-prem environments.

* 3+ years of experience in data engineering, backend data systems, or cybersecurity data processing.
* Strong Python skills and experience with pandas, PySpark, or Dask for large-scale data manipulation.
* Proven experience with data orchestration and transformation frameworks (Airflow, dbt, or Dagster).
* Experience integrating heterogeneous data sources (APIs, CSV, JSON, XML, or event streams).
* Experience with containerized environments (Docker, Kubernetes) and infrastructure automation (Terraform or Pulumi).
* Experience working with graph databases (Neo4j, ArangoDB) or ontology-based data modeling.
* Experience in multi-cloud or hybrid setups (GCP, Azure, on-prem).
* Freedom to design and shape a modern, secure data platform from the ground up.
* Competitive salary and benefits tailored to your experience.