We are looking for a talented and experienced Data Engineer to join our team! If you enjoy working with real-time data streaming, have a knack for solving problems quickly, and are passionate about extracting meaningful insights from data, we want to hear from you.
Responsibilities:
- Design, develop, and maintain scalable data pipelines for real-time and batch processing.
- Implement robust data streaming architectures to support business analytics and reporting requirements.
- Optimize data workflows and ETL processes for improved performance and reliability.
- Ensure data quality, integrity, and security throughout all stages of data processing.
- Monitor and troubleshoot data pipelines, addressing any issues in a timely manner.
- Apply a solid understanding of distributed computing, data modeling, and database design principles to pipeline and schema decisions.
Requirements:
- Strong proficiency in Apache Kafka for real-time data streaming and messaging.
- Extensive hands-on experience with ClickHouse for data storage and analytics.
- Advanced proficiency in Python for scripting, data manipulation, and automation.
- Experience with workflow management tools such as Apache Airflow.
- Familiarity with containerization technologies (Docker, Kubernetes) is a plus.
- Proven experience (3+ years) working as a Data Engineer or similar role.
- Excellent problem-solving skills and strong attention to detail.
- Effective communication skills and the ability to collaborate in a team environment.