Bit24

Data Engineer

Location: Tehran / Gisha
Employment Type: Full-time, 44 hours a week
Benefits: Loan, Bonus, Military service option, Health insurance, Parking space, Flexible working hours, Learning stipends, Game room, Lunch, Snacks, Resting space, Breakfast, Occasional packages and gifts
Company Size: 51 - 200 employees
Industry: Finance / Investment
About: Iranian company dealing with Iranian and foreign customers
Founded: 1398 (Iranian calendar)
Ownership: Privately held

Key Requirements

5 years of experience in a similar position
Python - Advanced
GIT - Advanced
MongoDB - Intermediate
MQL - Intermediate
Gitlab - Intermediate

Job Description

About the Role

We are seeking a Mid-Level / Senior Data Engineer to design, implement, and maintain scalable data pipelines and storage solutions that enable company-wide data-driven decision-making. You'll manage real-time analytics systems, collaborate with cross-functional teams, and ensure efficient, reliable data flow. 

 

Technical Skills

  • Programming Languages: Python, Java, or Scala
  • Database Technologies: SQL, NoSQL (e.g., MongoDB, Cassandra)
  • Data Processing Frameworks: Apache Spark, Hadoop
  • Data Warehousing: Amazon Redshift, Google BigQuery, Snowflake
  • ETL Tools: Apache Airflow, Apache NiFi, Talend, Informatica
  • Cloud Platforms: AWS, Azure, Google Cloud Platform
  • Version Control: Git, GitHub, GitLab

 

Responsibilities:

  • Design, develop, and maintain scalable data pipelines using Apache Spark and other big data technologies.
  • Build and maintain data architectures on Hadoop or similar distributed file systems.
  • Collaborate with cross-functional teams to identify, design, and implement data-driven solutions to complex business problems.
  • Optimize data systems for maximum performance and scalability.
  • Develop and manage real-time analytics systems, ensuring their reliability, performance, and maintainability.
  • Propose and refine data architecture to meet evolving business needs.
  • Collaborate with Business Intelligence, Ventures, and Data Science teams to ensure their data requirements are met.
  • Monitor and troubleshoot data services, resolving any issues that arise.
  • Set up real-time analytics solutions tailored to specific services and business demands.
  • Ensure highly efficient data pipelines by identifying and fixing performance bottlenecks.
  • Design, implement, and maintain data infrastructure to ensure steady and uninterrupted data flow.
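To give a flavor of the pipeline work described above, here is a minimal extract-transform-load sketch in plain Python. It uses only the standard library so it runs anywhere; in this role the same pattern would be expressed on Apache Spark against distributed storage. The event schema and field names are purely illustrative, not part of Bit24's actual stack.

```python
import json
from typing import Iterator

# Hypothetical raw trade events, standing in for records read from
# HDFS or a Kafka topic in a real deployment.
RAW_EVENTS = [
    '{"symbol": "BTC", "price": "67000.0", "qty": "0.5"}',
    '{"symbol": "ETH", "price": "3500.0", "qty": "2"}',
    "not-json",  # malformed input the pipeline must tolerate
]

def extract(lines) -> Iterator[dict]:
    """Parse raw JSON lines, skipping records that fail to decode."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production: route to a dead-letter store

def transform(events) -> Iterator[dict]:
    """Cast string fields to numbers and derive the notional value."""
    for e in events:
        price, qty = float(e["price"]), float(e["qty"])
        yield {"symbol": e["symbol"], "notional": price * qty}

def load(rows) -> list:
    """Materialize the result; a real job would write to a warehouse."""
    return list(rows)

result = load(transform(extract(RAW_EVENTS)))
print(result)
```

Because each stage is a generator, records stream through one at a time; that is the same composition Spark generalizes across a cluster.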

 

Job requirements:

  • Bachelor's or Master's degree in Computer Engineering/Science, or equivalent experience.
  • 2+ years of experience in data engineering or a related field.
  • Expertise in designing and maintaining scalable data pipelines and big data systems.
  • Proficiency in the Hadoop ecosystem (HDFS, YARN, Hive, Spark).
  • Hands-on experience with Kafka and ZooKeeper for data streaming and coordination.
  • Strong programming skills in Python, Java, Scala, or Go (minimum 2 years of experience).
  • Familiarity with monitoring systems such as Grafana, Prometheus, and exporters.
  • Experience working with Linux, virtualization, Docker, and Kubernetes.
  • Proven experience in setting up and maintaining real-time analytics and big data systems.
  • Hands-on experience with big data technologies such as Pig, Kafka, and NoSQL databases.
  • Strong communication skills and ability to work collaboratively in a team environment.
  • Excellent problem-solving skills and attention to detail.
  • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI, Metabase).
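The real-time analytics requirements above largely come down to consuming an event stream and maintaining windowed aggregates. Below is a stdlib-only sketch of a sliding-window counter; in practice it would sit behind a Kafka consumer, with the resulting counts exported to Prometheus/Grafana. The event names and window size are made up for illustration.

```python
from collections import deque

class SlidingWindowCounter:
    """Count events per key over the last `window` seconds."""

    def __init__(self, window: float):
        self.window = window
        self.events = deque()  # (timestamp, key) pairs, oldest first

    def record(self, ts: float, key: str) -> None:
        """Ingest one event and evict anything that has aged out."""
        self.events.append((ts, key))
        self._evict(ts)

    def counts(self, now: float) -> dict:
        """Return per-key counts for the window ending at `now`."""
        self._evict(now)
        out = {}
        for _, key in self.events:
            out[key] = out.get(key, 0) + 1
        return out

    def _evict(self, now: float) -> None:
        # Drop events that fell out of the window; deque.popleft is O(1).
        while self.events and self.events[0][0] <= now - self.window:
            self.events.popleft()

counter = SlidingWindowCounter(window=60.0)
counter.record(0.0, "deposit")
counter.record(10.0, "trade")
counter.record(55.0, "trade")
counter.record(65.0, "withdraw")  # the event at t=0 is now outside the window
print(counter.counts(65.0))
```

A deque keeps eviction cheap because events arrive in timestamp order, so only the oldest entries ever need removal.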

Job Requirements

Age: 28 - 45 years old
Gender: Men / Women
Software:
Python - Advanced
MQL - Intermediate
GIT - Advanced
MongoDB - Intermediate
Gitlab - Intermediate
