
SnappTrip is looking for a Data Engineer to join our growing engineering team and contribute to the design, development, and operation of our data platform. In this role, you will work on production-grade pipelines and services that support analytics and business use cases across the company. You will be expected to operate independently on well-scoped problems, collaborate closely with senior engineers, and help improve the reliability, performance, and maintainability of our data systems.
Key Responsibilities:
Build and Operate Pipelines:
Design, implement, and maintain reliable data pipelines using tools such as Airflow, dbt, and Spark, ensuring data correctness and timely delivery.
Streaming and CDC Integration:
Work with Kafka Connect and Debezium to build and maintain change data capture (CDC) and streaming pipelines.
Query and Analytics Enablement:
Develop and optimize data models and queries consumed via Trino and downstream tools such as Metabase.
System Monitoring and Troubleshooting:
Monitor pipelines and services, investigate failures or performance regressions, and resolve issues in Linux-based production environments.
Collaboration and Delivery:
Collaborate with backend engineers, analysts, and stakeholders to translate requirements into maintainable technical solutions.
Documentation and Best Practices:
Contribute to technical documentation, data models, and operational runbooks; follow and help refine engineering best practices.
Qualifications:
Experience:
3–5 years of experience as a Data Engineer or in a closely related backend/platform role.
Core Skills:
Strong SQL skills for data modeling, transformations, and performance optimization.
Hands-on experience with dbt for transformations and analytics engineering.
Experience working with Trino or similar distributed SQL engines.
Practical experience with Spark for batch or large-scale data processing.
Experience with Kafka Connect and Debezium for streaming and CDC pipelines.
Programming:
Proficiency in one or more of Scala, Go, or Python, with experience writing production-quality code.
Infrastructure & OS:
Comfortable working in Linux environments; familiarity with deployment, logging, and operational debugging.
Orchestration & BI:
Experience with Airflow for workflow orchestration.
Familiarity with BI and analytics tools such as Metabase.
Soft Skills:
Strong problem-solving and debugging skills.
Ability to work independently on assigned tasks while collaborating effectively within a team.
Clear written and verbal communication skills.
Adaptability in a fast-paced, evolving technical environment.