Snapp Market

Senior Data Engineer

Tehran / Zaferanieh
Full Time
Saturday to Wednesday
Company size: 501 - 1000 employees
Industry: Internet Provider / E-commerce / Online Services
Ownership: Iranian company dealing only with Iranian entities
Founded: 2018
Website: snappmarket
Type: Privately held

Key Requirements

5 years of experience in a similar position
MySQL - Intermediate
Java - Intermediate
Python - Intermediate
Kafka - Intermediate
Prometheus - Intermediate
Grafana - Intermediate

Job Description

About the Role

We are looking for a Senior Data Engineer to join our fast-paced data team. In this role, you’ll design, build, and maintain robust and scalable data pipelines and platforms. You’ll work closely with data scientists, analysts, and product teams to ensure efficient data flows, high-quality datasets, and low-latency infrastructure.

This is a hands-on technical role for someone passionate about data systems, stream processing, batch workflows, and big data infrastructure.

 

What You’ll Do:

  • Design and maintain real-time and batch data pipelines using Kafka, Spark, and Airflow.
  • Architect and optimize ClickHouse as the core analytical database for high-performance queries.
  • Build reliable ETL/ELT workflows to move data from MySQL and other sources into ClickHouse (see the sketch after this list).
  • Develop CDC (Change Data Capture) pipelines using Kafka Connect, Debezium, and Schema Registry.
  • Implement Medallion Architecture (Bronze, Silver, Gold layers) with Delta Lake for data versioning and governance.
  • Use MinIO for object storage and distributed data access in a cloud-free setup.
  • Write clean, efficient, and maintainable code in Python or Java.
  • Build reusable datasets for analytics and self-service dashboards via Metabase.
  • Monitor system performance and data pipeline health with Grafana and Prometheus.
  • Collaborate with BI and data science teams to ensure data availability and quality.
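
For illustration only, here is a minimal sketch of the kind of MySQL-to-ClickHouse batch ELT described above, written as an Airflow DAG. All database, table, credential, and host names are hypothetical, and it assumes the mysql-connector-python and clickhouse-driver packages; this sketches the general pattern, not Snapp Market's actual pipeline.

    import os
    from datetime import datetime

    import mysql.connector                      # assumes mysql-connector-python
    from airflow import DAG                     # Airflow 2.x
    from airflow.operators.python import PythonOperator
    from clickhouse_driver import Client        # assumes clickhouse-driver

    def mysql_to_clickhouse(ds: str, **_) -> None:
        """Copy one execution day of orders from MySQL into ClickHouse.

        `ds` is Airflow's logical date (YYYY-MM-DD); all names are hypothetical.
        """
        src = mysql.connector.connect(
            host="mysql",
            user="etl",
            password=os.environ["MYSQL_PASSWORD"],  # hypothetical credentials
            database="shop",
        )
        cur = src.cursor()
        cur.execute(
            "SELECT id, user_id, total, created_at FROM orders "
            "WHERE DATE(created_at) = %s",
            (ds,),
        )
        rows = cur.fetchall()
        src.close()
        # clickhouse-driver sends the whole row batch in a single INSERT.
        Client(host="clickhouse").execute(
            "INSERT INTO analytics.orders (id, user_id, total, created_at) VALUES",
            rows,
        )

    with DAG(
        dag_id="orders_mysql_to_clickhouse",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_orders", python_callable=mysql_to_clickhouse)

In a stack like the one described here, a batch DAG of this shape would typically complement the Debezium/Kafka CDC path: CDC handles low-latency replication, while batch jobs cover backfills and reprocessing.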

 

Required Qualifications:

  • 4+ years of experience as a data engineer or in a similar role.
  • Strong proficiency in Python and/or Java.
  • Solid hands-on experience with ClickHouse: schema design, query optimization, and production usage.
  • Deep knowledge of Kafka, Kafka Connect, and Schema Registry.
  • Experience with ETL/ELT pipelines and CDC (Change Data Capture) implementations.
  • Proven experience with Apache Airflow for orchestration.
  • Strong SQL skills and experience with MySQL or similar RDBMS.
  • Experience building self-service data layers using Metabase.
  • Hands-on experience with MinIO as an object storage layer.
  • Familiarity with Medallion Architecture and Delta Lake for data modeling and versioning.
  • Experience with Spark or PySpark.
  • Familiar with Docker, Docker Compose, and Kubernetes for deployment.
  • Working knowledge of monitoring/alerting using Grafana and Prometheus.

 

Nice to Have:

  • Experience with MLOps tools and practices (e.g., model versioning, serving, monitoring).
  • Understanding of Data Warehouse design (e.g., Star Schema, Dimensional Modeling).
  • Involvement in software engineering projects with clean code, testing, version control, and collaboration.
  • Exposure to CI/CD for data pipelines and infrastructure as code.
  • Experience processing front-end logs or event data at scale.
  • Interest in building internal platforms and tooling to support a larger engineering org.
  • Experience with Data Platform deployment: setting up tools, automation scripts, observability stacks, and configuration best practices.
  • Familiarity with Elasticsearch, Neo4j, and other databases/engines used for search, graph, or specialized workloads.
  • Experience with Ansible and Terraform for infrastructure automation and configuration management.

Job Requirements

Age: 25 - 40 years old
Gender: Men / Women
Software: Kafka (Intermediate), Python (Intermediate), MySQL (Intermediate), Java (Intermediate), Prometheus (Intermediate), Grafana (Intermediate)
