This position is within the Data Science department of Pooya Company, which specializes in AI-based software solutions for industries including banking and security. Our team consists of specialized, creative, and motivated individuals working in a dynamic, collaborative environment.
We are looking for an experienced and enthusiastic Data Engineer to join the Data Science department. You will play a key role in designing, building, and maintaining the data infrastructure and data pipelines that supply real-time analytics, fraud detection, reporting, and model improvement. If you are interested in working with large-scale data, real-time systems, and advanced technologies, this opportunity is for you.
Key Responsibilities:
- Design, build, test, and maintain scalable and reliable real-time and batch data pipelines.
- Integrate data from various internal and external sources (such as transaction logs, databases, and APIs).
- Ensure data quality, integrity, availability, and security throughout the data lifecycle.
- Implement and optimize stream processing solutions using Apache Flink and Apache Spark Streaming.
- Work with various database types, including SQL, NoSQL, time-series, and graph databases.
- Develop and implement monitoring, alerting, and logging systems for data pipelines.
- Collaborate closely with the rest of the team to understand requirements and deliver suitable data solutions.
- Optimize the performance of data pipelines and queries.
- Research and evaluate new technologies and tools in the data engineering field.
Requirements and Qualifications:
- BS or MS degree in Computer Science, Software Engineering, Information Technology, or a related field.
- Minimum 3 years of professional experience as a Data Engineer.
- Strong programming experience with Python and Java.
- High proficiency in SQL and data modeling.
- Hands-on experience with the Big Data ecosystem (e.g., Apache Spark, Hadoop).
- Experience with stream processing platforms (Kafka, Flink, and Spark Streaming).
- Familiarity with various database types (relational and NoSQL).
- Strong understanding of ETL/ELT concepts and Data Warehousing.
- Ability to troubleshoot complex data systems and solve problems.
- Good communication and teamwork skills.