Bimeh Bazar started its journey in 2016 with the aim of creating a great experience in online insurance consultation, comparison, and purchase. We are looking for minds that can contribute to our company's growth and guide it in the right direction. We want to collaborate with individuals who are driven to achieve accomplishments they can be proud of and who value clear, respectful human interactions.
Our Data Team collaborates to make this happen, and we have an exciting opportunity for you to join us as a Data Engineer, a crucial part of our team.
Responsibilities:
- Build the infrastructure required for optimal extraction, transformation, and loading (ETL) of structured and unstructured data from a wide variety of sources
- Acquire and extract data from various sources using provided APIs or other solutions such as web scraping
- Build data pipeline systems with event-driven or streaming architectures
- Deploy and manage systems used for data processing and analytics
- Build analytics tools and infrastructure that use data pipelines as their backbone, together with data lakes, warehouses, and lakehouses
Requirements:
- At least a Bachelor's degree in Computer Engineering or Computer Science
- 2+ years of experience as a data engineer or in other data-related positions
- 3+ years of experience with Python and Git
- Strong proficiency in database design and data pipeline architecture
- Experience with system architecture using micro-services paradigm
- Experience with relational SQL and NoSQL databases, including Postgres, MySQL, MSSQL, Elasticsearch, and MongoDB
- Experience with Apache Airflow to schedule periodic tasks utilizing reproducible and stateful DAGs
- Experience with Docker and Docker Compose to package, ship, and deploy services
- Experience with message brokers such as Kafka and RabbitMQ
- Experience with OLAP databases such as ClickHouse to manage and serve performant data warehouses
- Familiarity with developing APIs using gRPC
- Familiarity with CI/CD pipelines using GitLab CI or GitHub Actions
- Familiarity with caching tools such as Redis/Valkey/Dragonfly and Memcached
- Familiarity with CDC tools such as Debezium and Airbyte
Nice to have:
- Familiarity with orchestration platforms such as Docker Swarm or Kubernetes
- Familiarity with data processing systems such as Apache Spark
- Familiarity with stream-processing systems such as Spark Streaming
- Familiarity with data version control and continuous ML workflows
- Familiarity with AI Model Lifecycle Management paradigms and systems such as MLflow
Benefits:
- Comprehensive health insurance
- Snapp credit (cab, food, pay)
- Legal and mental health counseling services
- Insurance purchase subsidy
- Loans